Chinese Named Entity Recognition Based on B-LSTM Neural Network with Additional Features

2017 
Traditional methods for named entity recognition (NER) require heavy feature engineering to achieve high performance. We propose a novel neural network architecture for NER that detects word features automatically, without feature engineering. Our approach takes word embeddings as input, feeds them into a bidirectional long short-term memory (B-LSTM) network to model the context within a sentence, and outputs the NER results. This study extends the neural network language model through the B-LSTM, which outperforms other deep neural network models on NER tasks. Experimental results show that the B-LSTM with word embeddings trained on a large corpus achieves the highest F-score of 0.9247, outperforming state-of-the-art methods based on feature engineering.
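To illustrate the kind of architecture the abstract describes (word embeddings fed into a bidirectional LSTM whose outputs are mapped to entity tags), here is a minimal PyTorch sketch. It is an assumption-laden illustration, not the authors' implementation: the class name `BLSTMTagger`, the hyperparameters, and the tag-set size are all hypothetical, and the paper's additional features and pre-trained embeddings are not reproduced.

```python
# Minimal sketch of a bidirectional-LSTM sequence tagger (hypothetical,
# not the paper's exact model or hyperparameters).
import torch
import torch.nn as nn

class BLSTMTagger(nn.Module):
    def __init__(self, vocab_size, embedding_dim, hidden_dim, num_tags):
        super().__init__()
        # Word embeddings; in the paper these are trained on a large corpus.
        self.embedding = nn.Embedding(vocab_size, embedding_dim)
        # Bidirectional LSTM models left and right context within a sentence.
        self.lstm = nn.LSTM(embedding_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Linear layer maps concatenated forward/backward states to tag scores.
        self.fc = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        emb = self.embedding(token_ids)   # (batch, seq_len, embedding_dim)
        out, _ = self.lstm(emb)           # (batch, seq_len, 2 * hidden_dim)
        return self.fc(out)               # (batch, seq_len, num_tags)

# Usage example with illustrative sizes: two 5-token sentences, 7 BIO-style tags.
model = BLSTMTagger(vocab_size=10000, embedding_dim=100,
                    hidden_dim=128, num_tags=7)
tokens = torch.randint(0, 10000, (2, 5))
tag_scores = model(tokens)                # shape: (2, 5, 7)
```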