American Journal of Computer Science and Technology


Chinese NER with Softlexion and Residual Gated CNNs

Improving the accuracy and speed of Named Entity Recognition (NER), a key task in natural language processing, benefits many downstream applications. A method combining residual gated convolutions with an attention mechanism is proposed to address the poor recognition of nested entities and ambiguous entities by convolutional layers that lack context. The stacked convolutional layers fuse local continuous features with global ones to better capture contextual semantic information. In addition, the optimized embedding layer fuses character and lexical information by introducing a dictionary and is combined with a pre-trained BERT model that carries prior semantic knowledge, while the decoding layer operates at the entity level to alleviate the problem of nested and ambiguous entities in long text sequences. To reduce the large number of BERT parameters updated during training, the BERT layer is frozen and only the residual gated convolutional layers are iterated. In experiments on the MSRA corpus, the BERT-softlexion-RGCNN-GP model outperforms other models on entity recognition, reaching an F1 score of 94.96%, and it also trains faster than a bidirectional LSTM model. Our model thus maintains an efficient training speed while recognizing Chinese entities more precisely, which is of practical value for applications requiring both accuracy and speed.
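The residual gated convolution at the core of the abstract can be illustrated with a minimal NumPy sketch: a candidate feature map is modulated by a learned sigmoid gate and added back to the input through a residual connection. The function names, weight shapes, and "same" padding used here are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def conv1d(x, w, b):
    # "Same"-padded 1-D convolution over a (seq_len, d_in) sequence.
    # w has shape (kernel_size, d_in, d_out); b has shape (d_out,).
    k, d_in, d_out = w.shape
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    out = np.zeros((x.shape[0], d_out))
    for t in range(x.shape[0]):
        window = xp[t:t + k]                      # (k, d_in) slice of the sequence
        out[t] = np.einsum('kd,kdo->o', window, w) + b
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def residual_gated_conv_block(x, w_c, b_c, w_g, b_g):
    # Gated convolution: candidate features are element-wise modulated
    # by a sigmoid gate, then added to the input (residual connection),
    # so the block can pass local context through unchanged when useful.
    candidate = conv1d(x, w_c, b_c)
    gate = sigmoid(conv1d(x, w_g, b_g))
    return x + gate * candidate
```

Because the block is shape-preserving, several of them can be stacked, letting deeper layers see progressively wider context while the residual path keeps gradients stable.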

NER, BERT, Lexicon, Residual Gated CNNs

Zhang Yinglin, Liu Changhui, Huang Shufen. (2023). Chinese NER with Softlexion and Residual Gated CNNs. American Journal of Computer Science and Technology, 6(2), 67-73.

Copyright © 2023. The authors retain the copyright of this article.
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
