W-core Transformer Model for Chinese Word Segmentation

2021 
Chinese word segmentation is an important research topic in Natural Language Processing (NLP). In this paper, we build on the Transformer model and propose the Window Core (W-core) Transformer for this task. In this model, the W-core preprocesses sentence information according to the characteristics of Chinese and fuses these local features with the features extracted by the Transformer model. Experimental results show that the W-core Transformer improves on the segmentation performance of the original Transformer model. Finally, we further improve the W-core Transformer by increasing the number of encoder layers and by oversampling.
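The abstract does not spell out the exact architecture, but one plausible reading is a window-based local feature extractor whose output is fused with a standard Transformer encoder and then projected to segmentation tags. The sketch below illustrates that reading under explicit assumptions: the W-core is modelled as a 1-D convolution over character embeddings, fusion is simple addition, and segmentation is cast as BMES character tagging; the class name, window size, and all hyperparameters are hypothetical.

```python
# Minimal sketch (assumptions, not the authors' released code): a window-based
# "core" over character embeddings, combined with a Transformer encoder, for
# Chinese word segmentation framed as BMES sequence labeling.
import torch
import torch.nn as nn


class WCoreTransformerTagger(nn.Module):
    def __init__(self, vocab_size, d_model=256, nhead=4, num_layers=2,
                 window_size=3, num_tags=4):  # 4 tags: B, M, E, S
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        # "Window core": local n-gram features around each character
        # (a convolution is one simple way to realize a fixed window).
        self.w_core = nn.Conv1d(d_model, d_model, kernel_size=window_size,
                                padding=window_size // 2)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer,
                                             num_layers=num_layers)
        self.classifier = nn.Linear(d_model, num_tags)

    def forward(self, char_ids, padding_mask=None):
        x = self.embed(char_ids)                               # (B, T, d)
        local = self.w_core(x.transpose(1, 2)).transpose(1, 2)  # window features
        ctx = self.encoder(x, src_key_padding_mask=padding_mask)
        fused = ctx + local            # assumed fusion: add local and global features
        return self.classifier(fused)  # (B, T, num_tags) per-character tag scores


# Toy usage: two 8-character sentences over a vocabulary of 100 character ids.
model = WCoreTransformerTagger(vocab_size=100)
chars = torch.randint(1, 100, (2, 8))
print(model(chars).shape)  # torch.Size([2, 8, 4])
```

Increasing `num_layers` corresponds to the paper's deeper-encoder experiment; oversampling would be applied to the training data rather than to this module.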