Equipping recurrent neural network with CNN-style attention mechanisms for sentiment analysis of network reviews

2019 
Abstract Deep learning algorithms have achieved remarkable results in natural language processing (NLP) and computer vision. In particular, methods such as convolutional and recurrent neural networks have shown remarkable performance in text analysis tasks. From the attention-mechanism perspective, however, convolutional neural networks (CNNs) are applied less often than recurrent neural networks (RNNs), because RNNs can learn long-term dependencies and typically give better results than CNNs. Yet CNNs have their own advantage: they can extract high-level features that are invariant to local translation by applying a fixed-size local context at the input level. In this paper, we therefore propose a new model based on an RNN with a CNN-style self-attention mechanism, combining the merits of both architectures in a single model. In the proposed model, the CNN first learns high-level representations of words at the input level. Second, a self-attention mechanism directs the model toward the features that contribute most to the prediction task by computing attentive context vectors over the hidden-state representations generated by the CNN. Finally, the hidden-state representations from the CNN, together with the attentive context vectors, are processed sequentially by the RNN. To validate the model, we experiment on three benchmark datasets: Movie Review, Stanford Sentiment Treebank-1, and Treebank-2. The experimental results and their analysis demonstrate the effectiveness of the proposed model.
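To make the CNN-then-attention-then-RNN pipeline concrete, below is a minimal sketch in PyTorch of one plausible reading of the abstract: a 1-D convolution over word embeddings, a self-attention layer that produces an attentive context vector over the convolutional features, and an LSTM that consumes both. All layer sizes, the kernel width, and the exact way the context vector is fed to the RNN are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CNNAttentionRNN(nn.Module):
    """Sketch of the CNN -> self-attention -> RNN pipeline described above.
    Hyperparameters (embed_dim, conv_dim, kernel size, hidden_dim,
    num_classes) are illustrative assumptions, not values from the paper."""

    def __init__(self, vocab_size, embed_dim=300, conv_dim=128,
                 hidden_dim=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Fixed-size local context over word embeddings (the CNN stage)
        self.conv = nn.Conv1d(embed_dim, conv_dim, kernel_size=3, padding=1)
        # Self-attention scoring over the convolutional feature sequence
        self.attn = nn.Linear(conv_dim, 1)
        # The RNN consumes each convolutional feature concatenated with
        # the attentive context vector (hence conv_dim * 2 inputs)
        self.rnn = nn.LSTM(conv_dim * 2, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):                        # (batch, seq)
        x = self.embed(token_ids)                        # (batch, seq, embed)
        h = F.relu(self.conv(x.transpose(1, 2)))         # (batch, conv, seq)
        h = h.transpose(1, 2)                            # (batch, seq, conv)
        scores = torch.softmax(self.attn(h), dim=1)      # (batch, seq, 1)
        # Attentive context vector: attention-weighted sum of CNN features
        context = (scores * h).sum(dim=1, keepdim=True)  # (batch, 1, conv)
        context = context.expand(-1, h.size(1), -1)      # repeat per step
        out, _ = self.rnn(torch.cat([h, context], dim=-1))
        return self.classifier(out[:, -1])               # last hidden state
```

For a binary sentiment task such as Movie Review, the model would be called as `CNNAttentionRNN(vocab_size=20000)(batch_of_token_ids)` and trained with a standard cross-entropy loss; multi-class settings like the Stanford Sentiment Treebank would simply raise `num_classes`.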