Chinese Named Entity Recognition with Character-Level BLSTM and Soft Attention Model

Journal of Beijing Institute of Technology ›› 2020, Vol. 29 ›› Issue (1): 60-71. DOI: 10.15918/j.jbit1004-0579.18161


Abstract

Unlike named entity recognition (NER) for English, Chinese NER suffers reduced accuracy because of the absence of explicit word boundaries. To avoid the accumulated error introduced by word segmentation, this paper proposes a new Chinese NER method built on a deep model that extracts character-level features. The method converts raw text into a character vector sequence, extracts global text features with a bidirectional long short-term memory (BLSTM) network, and extracts local text features with a soft attention model. A linear-chain conditional random field (CRF) then labels all characters using the global and local text features. Experiments on the Microsoft Research Asia (MSRA) dataset show that the proposed method performs well compared with other methods, indicating that the extracted global and local text features benefit Chinese NER. To broaden the range of test domains, a resume dataset from Sina Finance is also used to demonstrate the effectiveness of the proposed method.
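The abstract outlines a character-level pipeline: character embeddings, a BLSTM for global features, soft attention for local features, and a linear-chain CRF for labelling. Below is a minimal PyTorch sketch of that pipeline; the framework, layer sizes, the dot-product form of the attention, and the class name `CharBLSTMAttention` are assumptions not taken from the paper, and the CRF decoding layer is omitted (the projected scores here would serve as its emission scores).

```python
# Minimal sketch of the described architecture (assumed PyTorch implementation).
# Character embedding -> BLSTM (global features) -> soft self-attention
# (local features) -> per-character tag scores; a linear-chain CRF from the
# paper would decode on top of these scores and is not reproduced here.
import torch
import torch.nn as nn


class CharBLSTMAttention(nn.Module):
    def __init__(self, vocab_size, num_tags, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # Global text features: BLSTM over the character sequence.
        self.blstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                             bidirectional=True)
        # Scaling factor for dot-product soft attention (an assumed choice).
        self.scale = (2 * hidden_dim) ** 0.5
        # Emission scores per character; a CRF layer would sit on top.
        self.proj = nn.Linear(4 * hidden_dim, num_tags)

    def forward(self, char_ids):                      # (batch, seq_len)
        x = self.embedding(char_ids)                  # (batch, seq_len, emb)
        global_feats, _ = self.blstm(x)               # (batch, seq_len, 2*hidden)
        # Soft attention: each character attends to all characters in the sentence.
        scores = global_feats @ global_feats.transpose(1, 2) / self.scale
        weights = torch.softmax(scores, dim=-1)       # (batch, seq_len, seq_len)
        local_feats = weights @ global_feats          # (batch, seq_len, 2*hidden)
        combined = torch.cat([global_feats, local_feats], dim=-1)
        return self.proj(combined)                    # (batch, seq_len, num_tags)


if __name__ == "__main__":
    model = CharBLSTMAttention(vocab_size=5000, num_tags=7)
    dummy = torch.randint(0, 5000, (2, 20))           # two sentences of 20 characters
    print(model(dummy).shape)                         # torch.Size([2, 20, 7])
```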

Keywords

Chinese / named entity recognition (NER) / character-level / bidirectional long short-term memory / soft attention model

Cite this article

Chinese Named Entity Recognition with Character-Level BLSTM and Soft Attention Model. Journal of Beijing Institute of Technology, 2020, 29(1): 60-71. DOI: 10.15918/j.jbit1004-0579.18161
