Combining Knowledge with Attention Neural Networks for Short Text Classification

2021 
Text classification has emerged as an important research area in natural language processing (NLP) over the last few years. Unlike formal documents and paragraphs, short texts are more ambiguous because they lack contextual information and suffer from data sparsity, which poses a great challenge to traditional classification methods. To address this problem, conceptual knowledge has been introduced to enrich the information in short texts. However, this approach assumes that all knowledge is equally important, which makes it harder to distinguish between short-text classes; it also introduces knowledge noise into the text, degrading classification performance. To measure the importance of concepts to short texts, this paper introduces an attention mechanism. Text-Relevant-Concept (T-RC) attention resolves the ambiguity of concepts and selects the meaning that best aligns with the short text, while Concept-Relevant-Concept (C-RC) attention handles the conceptual hierarchy and the relative importance of each concept. We propose a model Combining Knowledge with Attention Neural Networks (CK-ANN). Experiments show that CK-ANN outperforms state-of-the-art methods on text classification benchmarks, demonstrating the effectiveness of our approach.
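The core idea of the T-RC mechanism described above is to score each candidate concept against the short text and weight concepts by relevance, so that noisy or ambiguous concepts contribute less. A minimal sketch of such dot-product attention over concept embeddings (all names, dimensions, and values are illustrative assumptions, not the paper's actual implementation) might look like:

```python
import numpy as np

def concept_attention(text_vec, concept_vecs):
    """Weight each concept by its relevance to the short text via a softmax
    over dot-product scores, then return the attention-pooled concept vector."""
    scores = concept_vecs @ text_vec             # one relevance score per concept
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                     # softmax -> attention weights
    pooled = weights @ concept_vecs              # weighted concept representation
    return weights, pooled

# Toy example: 3 candidate concepts in a 4-d embedding space (made-up values).
text = np.array([1.0, 0.0, 0.5, 0.0])
concepts = np.array([[0.9, 0.1, 0.4, 0.0],   # closely related concept
                     [0.0, 1.0, 0.0, 0.0],   # unrelated concept
                     [0.5, 0.0, 0.9, 0.1]])  # moderately related concept
weights, pooled = concept_attention(text, concepts)
```

In this toy setup the unrelated concept receives the smallest attention weight, which is the behavior the paper's attention mechanism aims for; the pooled vector would then be combined with the text representation before classification.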