Improving Sentence Representations with Local and Global Attention for Classification

2019 
Representation learning is a key issue for text classification. Few existing representation models capture sufficient text information, namely both local semantic information and global structure information. This paper focuses on generating better semantic and structure representations and combining them into a stronger sentence representation. Specifically, we propose a hierarchical local and global attention network that learns sentence representations automatically: local attention generates the semantic and structure representations separately, and global attention fuses them into the final representation, which is then used for training and prediction. Experimental results show that our method performs well on several text classification tasks, including sentiment analysis, subjectivity classification, and question type classification, with accuracies of 81.6% (MR), 93.6% (SUBJ), 49.4% (SST-5), and 95.6% (TREC).
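The abstract describes a two-level scheme: local attention pools each component (semantic, structure) into its own vector, and global attention weighs those vectors into one sentence representation. The paper's exact parameterization is not given here, so the following is a minimal NumPy sketch under assumed dot-product attention; the query vectors `w_sem`, `w_str`, and `q` stand in for learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def local_attention(h, w):
    """Pool token states h (seq_len, d) into one vector via a query w (d,)."""
    scores = softmax(h @ w)          # attention weights over tokens, sum to 1
    return scores @ h                # (d,) component representation

def global_attention(reps, q):
    """Fuse stacked component representations reps (k, d) with a query q (d,)."""
    weights = softmax(reps @ q)      # attention weights over components
    return weights @ reps            # (d,) final sentence representation

rng = np.random.default_rng(0)
d, seq_len = 8, 5
h_sem = rng.normal(size=(seq_len, d))    # hypothetical semantic token states
h_str = rng.normal(size=(seq_len, d))    # hypothetical structure token states
w_sem, w_str, q = rng.normal(size=(3, d))

sem = local_attention(h_sem, w_sem)      # local semantic representation
struct = local_attention(h_str, w_str)   # local structure representation
final = global_attention(np.stack([sem, struct]), q)
```

In a full model, `final` would feed a softmax classifier over the task labels; here it only illustrates how the hierarchical attention composes the two representations.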