Sentence-level sentiment analysis via BERT and BiGRU

2019 
Sentiment analysis is an important task in natural language processing (NLP), and acquiring high-quality word representations is a key part of it. In particular, the same word can carry different meanings in different sentences, which the model should be able to recognize; traditional word embeddings cannot capture this well. In this paper, we propose a BERT (Bidirectional Encoder Representations from Transformers) + BiGRU (Bidirectional Gated Recurrent Unit) model, which first converts words into contextualized embeddings via BERT and then performs sentiment analysis with a BiGRU. Experimental results show that, compared with several baseline methods, our model achieves the best performance.
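Below is a minimal sketch of the BERT + BiGRU architecture described in the abstract, not the authors' implementation. It assumes PyTorch and the Hugging Face `transformers` library; the checkpoint name (`bert-base-uncased`), GRU hidden size, and number of sentiment classes are illustrative choices.

```python
# Sketch of a BERT + BiGRU sentence-level sentiment classifier (assumed details,
# not the paper's released code). BERT supplies contextualized token embeddings;
# a bidirectional GRU summarizes them; a linear layer predicts the sentiment class.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class BertBiGRUClassifier(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", gru_hidden=256, num_classes=2):
        super().__init__()
        # BERT produces a contextualized embedding for every token.
        self.bert = BertModel.from_pretrained(bert_name)
        # A bidirectional GRU reads the token embeddings forward and backward.
        self.bigru = nn.GRU(
            input_size=self.bert.config.hidden_size,
            hidden_size=gru_hidden,
            batch_first=True,
            bidirectional=True,
        )
        # Classify from the concatenated final forward/backward hidden states.
        self.classifier = nn.Linear(2 * gru_hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        # Token-level contextualized embeddings: (batch, seq_len, hidden)
        bert_out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        token_embeddings = bert_out.last_hidden_state
        # h_n: (num_directions, batch, gru_hidden)
        _, h_n = self.bigru(token_embeddings)
        sentence_repr = torch.cat([h_n[0], h_n[1]], dim=-1)
        return self.classifier(sentence_repr)


if __name__ == "__main__":
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertBiGRUClassifier()
    batch = tokenizer(["The movie was surprisingly good."],
                      return_tensors="pt", padding=True, truncation=True)
    logits = model(batch["input_ids"], batch["attention_mask"])
    print(logits.softmax(dim=-1))  # predicted sentiment probabilities
```

In practice the BERT weights are typically fine-tuned jointly with the BiGRU and classifier on the labeled sentiment data, rather than kept frozen.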