Chinese Word Embeddings with Subwords

2018 
Word embeddings are useful in a wide variety of natural language processing tasks. Recently, more research has focused on learning word embeddings with morphological knowledge of words, such as character and subword information. In this paper, we present a new method that uses subwords and characters together to enhance word embeddings (SWE). In our model, subword and character vectors are used to modify the direction of word vectors rather than being added to them directly. We evaluate SWE on both the word similarity task and the analogical reasoning task. The results demonstrate that our model learns better Chinese word embeddings than other baseline models.
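To illustrate the distinction the abstract draws, the sketch below contrasts a baseline that adds subword/character vectors directly to the word vector with a composition that only adjusts the word vector's direction while preserving its magnitude. The mixing weight `lam`, the averaging of subword and character vectors, and the normalization scheme are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch: additive composition vs. direction-only adjustment.
# All specifics (lam, averaging, rescaling) are assumptions for illustration.
import numpy as np

def compose_additive(word_vec, part_vecs):
    """Baseline: add the averaged subword/character vectors to the word vector."""
    return word_vec + np.mean(part_vecs, axis=0)

def compose_directional(word_vec, part_vecs, lam=0.5):
    """Shift only the direction of the word vector toward the subword/character
    information, then rescale to keep the original word vector's norm."""
    mixed = word_vec + lam * np.mean(part_vecs, axis=0)
    return mixed / np.linalg.norm(mixed) * np.linalg.norm(word_vec)

# Toy example with random vectors standing in for a Chinese word,
# its subwords, and its characters.
rng = np.random.default_rng(0)
w = rng.normal(size=100)           # word vector
subs = rng.normal(size=(2, 100))   # subword vectors
chars = rng.normal(size=(2, 100))  # character vectors
parts = np.vstack([subs, chars])

v_add = compose_additive(w, parts)
v_dir = compose_directional(w, parts)
print(np.linalg.norm(w), np.linalg.norm(v_add), np.linalg.norm(v_dir))
# The directional variant keeps the word vector's magnitude; the additive one does not.
```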