TW-TGNN: Two Windows Graph-Based Model for Text Classification
2021
Text classification is a fundamental and classical task in natural language processing (NLP). Recently, graph neural network (GNN) methods have been applied to this task because of their superior capacity for capturing global co-occurrence information. However, some existing GNN-based methods adopt a corpus-level graph structure, which causes high memory consumption. In addition, these methods do not account for global co-occurrence information and local semantic information at the same time. To address these problems, we propose a new GNN-based model, the two-windows text GNN model (TW-TGNN), for text classification. More specifically, we build a text-level graph for each text with a local sliding window and a dynamic global window. The local window, sliding inside the text, captures sufficient local semantic features; the dynamic global window, sliding between texts, generates a dynamic shared weight matrix, which overcomes the limitation of a fixed corpus-level co-occurrence graph and provides richer dynamic global information. Experimental results on four benchmark datasets show that the proposed method improves over state-of-the-art text classification methods. Moreover, we find that our method captures adequate global information for short texts, which helps overcome the lack of contextual information in short-text classification.
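To make the text-level graph construction concrete, the following is a minimal sketch (not the authors' implementation) of how a local sliding window can turn a single text into a co-occurrence graph: nodes are unique tokens, and an edge weight counts how often two tokens fall within the same window. The function name, window size, and edge representation are illustrative assumptions.

```python
from collections import defaultdict

def build_text_graph(tokens, window_size=3):
    """Hypothetical sketch: build a text-level co-occurrence graph.

    Each token is a node; an undirected edge (stored as a frozenset pair)
    is weighted by how many times the two tokens co-occur within a
    sliding window of `window_size` tokens.
    """
    edges = defaultdict(int)
    for i, left in enumerate(tokens):
        # Pair the current token with the others inside its window.
        for j in range(i + 1, min(i + window_size, len(tokens))):
            if left != tokens[j]:
                edges[frozenset((left, tokens[j]))] += 1
    return dict(edges)

graph = build_text_graph("the cat sat on the mat".split(), window_size=3)
# "the" and "sat" co-occur in two distinct windows of this sentence.
```

The paper's dynamic global window would additionally slide between texts to update a shared weight matrix; this sketch covers only the local, per-text half of the construction.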