Aspect term extraction for opinion mining using a Hierarchical Self-Attention Network

2021 
Abstract Aspect identification is one of the important sub-tasks in opinion mining, and it can be framed as a token-level sequence labeling problem. Most recent approaches employ BERT-based networks to identify aspect terms; these models are often complex, consume a lot of memory, and need long training times. In this paper, we propose a novel Hierarchical Self-Attention Network (HSAN) that performs well while requiring less memory and training time. HSAN applies a self-attention mechanism hierarchically: it first captures the importance of each word in the context of the overall meaning of the sentence, and then it explores the internal dependencies among words in the same sentence to identify interdependent collocated words. A fusion of these two attention mechanisms helps HSAN effectively predict multiple aspect terms in a given sentence, including multi-token aspect terms. Our proposed network uses word embeddings that combine general-purpose embeddings with domain-specific embeddings. We evaluate the performance of HSAN on the SemEval-2014 datasets; experimental results demonstrate the efficiency and effectiveness of our model.
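As a rough illustration of the architecture the abstract describes, the PyTorch sketch below combines general-purpose and domain-specific embeddings, stacks two self-attention levels hierarchically (sentence-level importance, then word-level dependencies), and fuses the two views for token-level BIO aspect tagging. The layer sizes, the BiO tag set, the concatenation-based fusion, and the use of nn.MultiheadAttention are illustrative assumptions; the paper's exact design is not specified in the abstract.

import torch
import torch.nn as nn

class HSANSketch(nn.Module):
    """Minimal sketch of a Hierarchical Self-Attention Network for aspect
    term extraction, based only on the abstract. Dimensions, the fusion
    step, and the tagging head are assumptions, not the authors' design."""

    def __init__(self, vocab_size, general_dim=300, domain_dim=100,
                 hidden_dim=128, num_heads=4, num_tags=3):
        super().__init__()
        # Two embedding tables standing in for pretrained general-purpose
        # (e.g. GloVe) and domain-specific embeddings, concatenated per token.
        self.general_emb = nn.Embedding(vocab_size, general_dim)
        self.domain_emb = nn.Embedding(vocab_size, domain_dim)
        self.proj = nn.Linear(general_dim + domain_dim, hidden_dim)

        # First level: each word attends over the whole sentence, capturing
        # its importance to the overall sentence meaning.
        self.sentence_attn = nn.MultiheadAttention(hidden_dim, num_heads,
                                                   batch_first=True)
        # Second level: attention over the first level's output, modeling
        # dependencies among words (e.g. multi-token collocations).
        self.word_attn = nn.MultiheadAttention(hidden_dim, num_heads,
                                               batch_first=True)
        # Fuse the two attention views by concatenation, then tag each
        # token with BIO labels (B-ASP, I-ASP, O).
        self.classifier = nn.Linear(2 * hidden_dim, num_tags)

    def forward(self, token_ids):
        x = torch.cat([self.general_emb(token_ids),
                       self.domain_emb(token_ids)], dim=-1)
        h = torch.tanh(self.proj(x))
        sent_view, _ = self.sentence_attn(h, h, h)
        word_view, _ = self.word_attn(sent_view, sent_view, sent_view)
        fused = torch.cat([sent_view, word_view], dim=-1)
        return self.classifier(fused)  # (batch, seq_len, num_tags) logits

# Usage: tag a batch of two 6-token sentences.
model = HSANSketch(vocab_size=10000)
logits = model(torch.randint(0, 10000, (2, 6)))
predicted_tags = logits.argmax(dim=-1)  # one BIO tag index per token
print(predicted_tags.shape)  # torch.Size([2, 6])

Concatenating the two attention outputs is one simple fusion choice; a gating or weighted-sum mechanism would serve the same purpose of letting the tagger draw on both the sentence-level and word-level views.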