Self-Attention-Based BiLSTM Model for Short Text Fine-grained Sentiment Classification

2019 
Fine-grained sentiment polarity classification for short texts remains an important and challenging task in natural language processing. A short text may contain multiple aspect-terms and opinion terms expressing different sentiments for different aspect-terms, and the polarity of the whole sentence is highly correlated with both. Two challenges arise: how to effectively use contextual information and semantic features, and how to model the correlations between aspect-terms and context words, including opinion terms. To address these problems, a Self-Attention-Based BiLSTM model with aspect-term information is proposed for fine-grained sentiment polarity classification of short texts. The proposed model effectively uses contextual information and semantic features, and in particular models the correlations between aspect-terms and context words. The model consists of a word-encoding layer, a BiLSTM layer, a self-attention layer, and a softmax layer. The BiLSTM layer aggregates information from the two opposite directions of a sentence through two independent LSTMs. The self-attention layer captures the parts of a sentence that are most important for a given aspect-term. Between the BiLSTM layer and the self-attention layer, the hidden vectors and the aspect-term vector are fused by element-wise addition, which reduces the computational complexity that direct vector concatenation would incur. Experiments are conducted on the public Restaurant and Laptop corpora from SemEval 2014 Task 4 and the Twitter corpus from ACL 2014, with the Friedman and Nemenyi tests used in the comparative study. Compared with existing methods, the experimental results demonstrate that the proposed model is feasible and effective.
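The fusion-and-attention step described above can be sketched in plain Python. This is a minimal illustration, not the paper's exact formulation: the function names, the single learned weight vector `w` used as the attention scorer, and the toy dimensions are all assumptions introduced here for clarity. The key idea it shows is that each BiLSTM hidden state is fused with the aspect-term vector by element-wise addition (keeping the dimension unchanged, unlike concatenation, which would double it), and the fused vectors are then scored and softmax-normalized to weight the sentence representation.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def vec_add(u, v):
    return [a + b for a, b in zip(u, v)]

def aspect_attention(hidden_states, aspect_vec, w):
    # Fuse each BiLSTM hidden state with the aspect-term vector by
    # element-wise addition; the dimension stays the same, so the
    # attention scorer is cheaper than it would be after concatenation.
    fused = [vec_add(h, aspect_vec) for h in hidden_states]
    # Score each fused vector against an (assumed) learned weight
    # vector w, then normalize the scores into attention weights.
    scores = [dot(f, w) for f in fused]
    alphas = softmax(scores)
    # Sentence representation: attention-weighted sum of hidden states.
    dim = len(hidden_states[0])
    rep = [sum(alphas[t] * hidden_states[t][i]
               for t in range(len(hidden_states)))
           for i in range(dim)]
    return rep, alphas

# Toy usage: three time steps, 2-dimensional hidden states.
hs = [[0.1, 0.2], [0.4, -0.3], [0.0, 0.5]]
aspect = [0.2, 0.1]
w = [1.0, -1.0]
rep, alphas = aspect_attention(hs, aspect, w)
```

In the actual model the hidden states come from the bidirectional LSTM and the attention parameters are trained jointly with the rest of the network; the sketch only isolates the additive-fusion design choice.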