Attention-based Joint Representation Learning Network for Short Text Classification

2020 
Deep neural networks have recently achieved success in learning distributed representations for text classification. However, because user-generated comments are short and information-sparse, existing approaches exploit only part of the available semantic information when classifying a sentence. In this paper, we propose a novel attention-based joint representation learning network (AJRLN). The proposed model uses two attention-based subnets to extract different attentive features from the sentence embedding; these features are then combined by a representation combination layer to obtain a joint representation of the whole sentence for classification. We conduct extensive experiments on the SST, TREC and SUBJ datasets. The experimental results demonstrate that our model achieves comparable or better performance than other state-of-the-art methods.
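The abstract does not detail the internals of the two attention subnets or the representation combination layer, so the following is only a minimal sketch of the described architecture, assuming each subnet is an additive attention pooling over word embeddings and the combination layer is concatenation followed by a linear projection. All names (AttentionSubnet, AJRLN) and hyperparameters here are illustrative, not the authors' implementation.

```python
# Hypothetical PyTorch sketch of a two-subnet attention model with a
# representation combination layer, as outlined in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionSubnet(nn.Module):
    """Attention pooling: scores each word and returns a weighted sum."""

    def __init__(self, embed_dim: int, attn_dim: int):
        super().__init__()
        self.proj = nn.Linear(embed_dim, attn_dim)
        self.score = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, x, mask):
        # x: (batch, seq_len, embed_dim); mask: (batch, seq_len), 1 for real tokens
        scores = self.score(torch.tanh(self.proj(x))).squeeze(-1)   # (batch, seq_len)
        scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = F.softmax(scores, dim=-1)                          # attention weights
        return torch.bmm(weights.unsqueeze(1), x).squeeze(1)         # (batch, embed_dim)


class AJRLN(nn.Module):
    """Two attention subnets whose outputs are joined for classification."""

    def __init__(self, vocab_size, embed_dim, attn_dim, num_classes):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.subnet_a = AttentionSubnet(embed_dim, attn_dim)  # first attentive view
        self.subnet_b = AttentionSubnet(embed_dim, attn_dim)  # second attentive view
        self.combine = nn.Linear(2 * embed_dim, embed_dim)    # representation combination layer
        self.classifier = nn.Linear(embed_dim, num_classes)

    def forward(self, token_ids):
        mask = (token_ids != 0).long()
        x = self.embedding(token_ids)
        joint = torch.cat([self.subnet_a(x, mask), self.subnet_b(x, mask)], dim=-1)
        joint = torch.tanh(self.combine(joint))               # joint sentence representation
        return self.classifier(joint)                         # class logits


# Usage: classify a batch of padded token-id sequences.
model = AJRLN(vocab_size=10000, embed_dim=128, attn_dim=64, num_classes=6)  # e.g. TREC has 6 classes
logits = model(torch.randint(1, 10000, (4, 20)))  # (batch=4, seq_len=20) -> (4, 6)
```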