Incorporating Context-Relevant Concepts into Convolutional Neural Networks for Short Text Classification

2019 
Abstract

Text classification is an important task in natural language processing. Previous text classification models do not perform well on short texts due to the data sparsity problem. To address this, recent work extracts the concepts associated with words to enrich the text representation. However, this approach may introduce overly general concepts that are not helpful for discriminating between categories in text classification. Furthermore, it may introduce noise into the text representation and degrade performance. To tackle these problems, we propose a neural network called DE-CNN, which incorporates context-relevant concepts into a convolutional neural network for short text classification. Our model first uses two layers to extract concepts and context respectively, and then employs an attention layer to select the context-relevant concepts. These concepts are then incorporated into the text representation for short text classification. Experimental results on three text classification tasks show that our proposed model outperforms the compared state-of-the-art models.
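The abstract only outlines the architecture, so the following is a minimal PyTorch sketch of how such a model could be structured: a word (context) embedding layer, a concept embedding layer, an attention step that weights concepts by their relevance to the text's context, and a CNN classifier over the concept-enriched sequence. All layer choices, dimensions, the mean-pooled context encoder, and the bilinear attention scoring are illustrative assumptions; the paper's exact attention formulation, concept-retrieval step, and hyperparameters are not given in the abstract.

```python
# Minimal sketch of a DE-CNN-style model. Assumptions: PyTorch, hypothetical
# dimensions and layer choices; not the authors' exact implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DECNNSketch(nn.Module):
    def __init__(self, vocab_size, concept_vocab_size, embed_dim=300,
                 num_filters=100, kernel_sizes=(3, 4, 5), num_classes=2):
        super().__init__()
        # Two embedding layers: one for words (context), one for concepts.
        self.word_embed = nn.Embedding(vocab_size, embed_dim)
        self.concept_embed = nn.Embedding(concept_vocab_size, embed_dim)
        # Projection used to score concepts against the context vector.
        self.attn_proj = nn.Linear(embed_dim, embed_dim)
        # Convolutions over the concept-enriched token representation.
        self.convs = nn.ModuleList(
            nn.Conv1d(2 * embed_dim, num_filters, k) for k in kernel_sizes
        )
        self.classifier = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, word_ids, concept_ids):
        # word_ids:    (batch, seq_len)      token ids of the short text
        # concept_ids: (batch, n_concepts)   ids of retrieved concepts
        words = self.word_embed(word_ids)           # (B, L, D)
        concepts = self.concept_embed(concept_ids)  # (B, C, D)

        # Context vector summarizing the short text (simple mean pooling here).
        context = words.mean(dim=1)                 # (B, D)

        # Attention: weight each concept by relevance to the context, so
        # generic or noisy concepts receive low weight.
        scores = torch.bmm(concepts, self.attn_proj(context).unsqueeze(2))  # (B, C, 1)
        weights = F.softmax(scores, dim=1)
        concept_vec = (weights * concepts).sum(dim=1)  # (B, D)

        # Incorporate the context-relevant concept vector into every token
        # position, then apply the CNN classifier.
        enriched = torch.cat(
            [words, concept_vec.unsqueeze(1).expand_as(words)], dim=2
        )                                           # (B, L, 2D)
        x = enriched.transpose(1, 2)                # (B, 2D, L)
        pooled = [F.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.classifier(torch.cat(pooled, dim=1))
```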