Sentiment-Aware Short Text Classification Based on Convolutional Neural Network and Attention

2019 
Danmaku is an emerging socio-digital media paradigm that overlays anonymous, asynchronous user-generated comments with rich sentiment information onto videos. This study focuses on a new and challenging short text classification task, sentiment-aware Danmaku classification, to understand users' opinions through collective intelligence. Currently, Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) based models perform poorly on Danmaku classification because they ignore the positional information of words within a sentence and suffer from data sparsity. To address these limitations, this paper proposes an attention- and CNN-based sentiment-aware short text classifier to advance the state of the art in short text classification. First, we define a classification criterion for Danmaku intent with six categories, considering three sentiment polarities and two language types. Then, by introducing an attention mechanism into a Bi-LSTM based model, in which positional information is preserved during training, we achieve accurate Danmaku sentiment classification and generate a sentiment embedding. Lastly, to achieve accurate sentiment-aware Danmaku intent classification, the generated sentiment embedding is fed into a CNN-based sentence classifier through a channel-attention layer. Experimental results show that the attention and Bi-LSTM based Danmaku sentiment classifier achieves a sentiment classification accuracy of 76%. Furthermore, compared with baseline models, our proposed intent classifier delivers superior performance in classifying the six Danmaku intents. To the best of our knowledge, this is the first study that leverages deep learning and an attention mechanism for sentiment-aware Danmaku intent classification.
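
The abstract describes a two-stage architecture: a Bi-LSTM sentiment classifier with attention that produces a sentiment embedding, followed by a CNN intent classifier that consumes this embedding through a channel-attention layer. The sketch below is only an illustration of that pipeline; the layer sizes, kernel sizes, and the gating-style fusion (`SentimentEncoder`, `IntentClassifier`, `channel_gate`) are assumptions and do not reproduce the paper's exact hyperparameters or implementation.

```python
# Minimal sketch of the two-stage sentiment-aware Danmaku classifier.
# All dimensions and the channel-attention fusion are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SentimentEncoder(nn.Module):
    """Bi-LSTM with additive attention over positions; yields a sentiment
    embedding plus 3-way sentiment logits (positive/neutral/negative)."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=64, num_sentiments=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)           # scores each time step
        self.sentiment_head = nn.Linear(2 * hidden_dim, num_sentiments)

    def forward(self, token_ids):
        h, _ = self.bilstm(self.embed(token_ids))           # (B, T, 2H), position-aware states
        weights = torch.softmax(self.attn(h), dim=1)        # (B, T, 1) attention weights
        sent_emb = (weights * h).sum(dim=1)                 # (B, 2H) sentiment embedding
        return sent_emb, self.sentiment_head(sent_emb)

class IntentClassifier(nn.Module):
    """TextCNN over word embeddings; the sentiment embedding gates the
    pooled convolutional channels (channel attention), then a 6-way intent head."""
    def __init__(self, vocab_size, embed_dim=128, sent_dim=128,
                 num_filters=100, kernel_sizes=(2, 3, 4), num_intents=6):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes)
        channels = num_filters * len(kernel_sizes)
        self.channel_gate = nn.Linear(sent_dim, channels)   # sentiment -> channel weights
        self.intent_head = nn.Linear(channels, num_intents)

    def forward(self, token_ids, sent_emb):
        x = self.embed(token_ids).transpose(1, 2)            # (B, E, T) for Conv1d
        feats = [F.relu(c(x)).max(dim=2).values for c in self.convs]
        feats = torch.cat(feats, dim=1)                       # (B, channels) max-pooled features
        gate = torch.sigmoid(self.channel_gate(sent_emb))     # per-channel attention weights
        return self.intent_head(feats * gate)                 # 6-way intent logits
```

In this reading, the first stage is trained on the sentiment labels, and its attention-pooled representation is reused as the sentiment embedding that re-weights the CNN feature channels in the second stage; how the two stages are trained (jointly or sequentially) is not specified by the abstract.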