Multi-semantic granularity graphical model with ensemble BERTs and multi-staged training method for text classification

2021 
Text classification is a classic problem in Natural Language Processing (NLP): the task is to assign predefined categories to a given text sequence. In this paper we present a new multi-semantic-granularity graphical model with ensemble BERTs and a multi-staged training method, which we call the EBG model (Ensemble BERTs with GNN). It combines two pre-trained language models, BERT and XLNet, with extra side information extracted from the raw sentences via a Graph Neural Network (GNN). This approach better captures complex semantic characteristics and exploits the complementary strengths of the BERT-family models to meet varied classification demands. Experiments on text classification tasks show that the proposed method significantly outperforms recent novel methods and models.
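The abstract does not specify how the EBG model fuses the BERT, XLNet, and GNN signals. A minimal sketch of one plausible fusion scheme is shown below, assuming each component produces class logits and the ensemble combines them by weighted averaging; the encoders are stubbed out, and the weights and function names are illustrative assumptions, not the paper's actual method.

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def ensemble_predict(bert_logits, xlnet_logits, gnn_logits,
                     weights=(0.4, 0.4, 0.2)):
    """Fuse per-class logits from the three components by weighted
    averaging (an assumed scheme), then return the argmax class.
    In the real model the logits would come from fine-tuned BERT,
    XLNet, and GNN heads; here they are passed in directly."""
    w_b, w_x, w_g = weights
    fused = [w_b * b + w_x * x + w_g * g
             for b, x, g in zip(bert_logits, xlnet_logits, gnn_logits)]
    probs = softmax(fused)
    return max(range(len(probs)), key=probs.__getitem__)
```

For example, with toy two-class logits where all three components favor class 0, `ensemble_predict([2.0, 0.1], [1.5, 0.3], [0.2, 0.1])` returns `0`.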