Intelligent Judicial Research Based on BERT Sentence Embedding and Multi-Level Attention CNNs
DOI: 10.23977/iset.2019.038
Author(s)
Bin Yang, Dakui Li, Nanhai Yang
Corresponding Author
Dakui Li
ABSTRACT
Multi-label text classification of accusations and of the relevant law articles is an important task in the construction of intelligent justice. In this paper, we apply multi-level attention mechanisms to a multi-kernel CNN and combine it with BERT sentence embeddings, proposing BERT-ACNN for these tasks. The architecture selectively extracts features and incorporates features produced by the BERT pre-trained language model. Experiments show that our model outperforms Average Pooling models, RNNs, and CNNs on the CAIL2018-Small dataset. Finally, we further improve the performance of BERT-ACNN through oversampling and by increasing the number of convolutional layers.
KEYWORDS
Multi-label Text Classification, Intelligent Justice, BERT Sentence Embedding, Attention, CNN
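The core idea of the abstract — running a multi-kernel CNN over BERT token embeddings and replacing max pooling with an attention-weighted sum — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the random matrix stands in for actual BERT output, and the kernel sizes, filter count, and single attention level are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d_features(x, kernel_size, n_filters, rng):
    """Valid 1-D convolution over the sequence axis of x: (seq_len, emb_dim)."""
    seq_len, emb_dim = x.shape
    W = rng.standard_normal((n_filters, kernel_size, emb_dim)) * 0.1
    out_len = seq_len - kernel_size + 1
    feats = np.empty((out_len, n_filters))
    for i in range(out_len):
        window = x[i:i + kernel_size]                      # (kernel_size, emb_dim)
        feats[i] = np.tanh(np.tensordot(W, window, axes=([1, 2], [0, 1])))
    return feats                                           # (out_len, n_filters)

def attention_pool(feats, rng):
    """Attention pooling: score each position, softmax, weighted sum."""
    v = rng.standard_normal(feats.shape[1])                # learned query (random here)
    scores = feats @ v
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ feats                                 # (n_filters,)

# Stand-in for BERT token embeddings of one sentence: 20 tokens x 768 dims
sentence = rng.standard_normal((20, 768))

# Multi-kernel CNN branches (kernel sizes 2, 3, 4), each attention-pooled,
# then concatenated into one document representation
pooled = [attention_pool(conv1d_features(sentence, k, 32, rng), rng)
          for k in (2, 3, 4)]
doc_vec = np.concatenate(pooled)
print(doc_vec.shape)  # (96,)
```

The attention pooling keeps a per-position weight for every filter response instead of discarding all but the maximum, which is what lets the model "selectively extract features" as the abstract describes; a classifier head over `doc_vec` would then predict the accusation and law-article labels.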