Semantics perception and refinement network for aspect-based sentiment analysis

2021 
Abstract With the proliferation of user-generated content on the web, sentiment analysis of text has become a research hotspot. Aspect-based sentiment analysis (ABSA) is a type of fine-grained sentiment analysis that aims to identify the sentiment polarity of an aspect given in a context sentence. To date, most ABSA methods have been based on recurrent neural networks (RNNs) and attention mechanisms. However, RNNs suffer from losing long-distance dependencies as the sentence length increases. In contrast, self-attention (SA) extracts dependencies among words regardless of their distance, yet, like RNNs, it is not sensitive enough to perceive the local semantic information needed for the ABSA task. In addition, the attention mechanisms employed in ABSA may introduce noise, which hinders the capture of important sentiment expressions. To address these issues, in this paper, we propose a semantics perception and refinement network (SPRN) for aspect-based sentiment analysis. In the SPRN, a novel structure named the dual gated multichannel convolution (DGMCC) is used to acquire the aspect-related semantic features of a sentence. Specifically, taking the sentence's global dependencies obtained by an SA-based mechanism as the context information, together with the aspect information over the sentence, a multichannel convolution (MCC) with multiple semantic spaces is designed to extract informative local semantic features. Moreover, a dual refinement gate (DRG) is proposed to strengthen the interaction between the aspect and the context at a fine granularity and to enhance noise filtering. To verify the effectiveness of the SPRN, we conduct extensive experiments on five benchmark datasets with pretrained BERT and GloVe embeddings. The experimental results demonstrate that the SPRN outperforms 14 state-of-the-art ABSA methods on various evaluation metrics, including accuracy, macro-F1 and AUROC.
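The abstract describes an architecture that combines self-attention (for global dependencies), multichannel convolution (for local semantics) and a gate (for noise filtering). The PyTorch sketch below illustrates this general idea only; the module names `MultiChannelConv` and `GatedSemanticsBlock`, the kernel sizes, dimensions and the sigmoid gating formula are assumptions made for illustration and are not the authors' exact SPRN/DGMCC/DRG implementation.

```python
# Illustrative sketch (assumed design, not the paper's exact architecture):
# self-attention supplies global context queried by the aspect, parallel
# "multichannel" convolutions extract local semantics, and a sigmoid gate
# fuses the two streams while filtering noise.
import torch
import torch.nn as nn


class MultiChannelConv(nn.Module):
    """Parallel 1-D convolutions over the token dimension (assumed design)."""

    def __init__(self, dim, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Conv1d(dim, dim, k, padding=k // 2) for k in kernel_sizes
        )
        self.proj = nn.Linear(dim * len(kernel_sizes), dim)

    def forward(self, x):                      # x: (batch, seq_len, dim)
        x = x.transpose(1, 2)                  # -> (batch, dim, seq_len)
        feats = [torch.relu(conv(x)) for conv in self.convs]
        feats = torch.cat(feats, dim=1).transpose(1, 2)
        return self.proj(feats)                # back to (batch, seq_len, dim)


class GatedSemanticsBlock(nn.Module):
    """Self-attention for global dependencies + gated fusion with local features."""

    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.mcc = MultiChannelConv(dim)
        self.gate = nn.Linear(2 * dim, dim)    # sigmoid gate to filter noise

    def forward(self, context, aspect):        # (batch, seq, dim) each
        # Global dependencies over the sentence, queried by the aspect tokens.
        global_ctx, _ = self.attn(aspect, context, context)
        # Local semantics from the multichannel convolution over the context.
        local_ctx = self.mcc(context)
        local_ctx = local_ctx.mean(dim=1, keepdim=True).expand_as(global_ctx)
        # Gated fusion: the gate decides how much global vs. local signal passes.
        g = torch.sigmoid(self.gate(torch.cat([global_ctx, local_ctx], dim=-1)))
        return g * global_ctx + (1 - g) * local_ctx


if __name__ == "__main__":
    block = GatedSemanticsBlock(dim=128)
    sentence = torch.randn(2, 20, 128)         # e.g. GloVe/BERT token embeddings
    aspect = torch.randn(2, 3, 128)            # embeddings of the aspect term
    print(block(sentence, aspect).shape)       # torch.Size([2, 3, 128])
```

The fused representation would then feed a classifier that predicts the sentiment polarity of the aspect; the paper's actual DGMCC/DRG modules refine this interaction at a finer granularity than this single-gate sketch.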