Clinical Text Classification with Word Embedding Features vs. Bag-of-Words Features

2018 
Word embeddings motivated by deep learning have shown promising results over traditional bag-of-words features for natural language processing. When trained on large text corpora, word embedding methods such as word2vec and doc2vec have the advantage of learning from unlabeled data and reducing the dimensionality of the feature space. In this study, we experimented with word2vec and doc2vec features for a set of clinical text classification tasks and compared the results with those obtained using traditional bag-of-words (BOW) features. The study showed that the word2vec features performed better than the BOW-1-gram features; however, when 2-grams were added to BOW, the comparison results were mixed.
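To make the comparison concrete, the sketch below illustrates the general idea under stated assumptions: it contrasts BOW features (1-grams, then 1-grams plus 2-grams) with document vectors obtained by mean-pooling word2vec embeddings, each feeding a logistic regression classifier. This is not the authors' pipeline; the toy corpus, labels, and all hyperparameters are hypothetical.

```python
# A minimal sketch (not the paper's pipeline) contrasting bag-of-words
# features with averaged word2vec features for a binary text classifier.
# The toy docs/labels and all hyperparameters are illustrative only.
import numpy as np
from gensim.models import Word2Vec
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

docs = [
    "patient denies chest pain or shortness of breath",
    "patient reports severe chest pain radiating to left arm",
    "no acute distress observed on examination",
    "acute myocardial infarction suspected on admission",
]
labels = [0, 1, 0, 1]  # hypothetical positive/negative classes

# Bag-of-words features: 1-grams only, then 1-grams + 2-grams.
for ngram_range in [(1, 1), (1, 2)]:
    bow = CountVectorizer(ngram_range=ngram_range).fit_transform(docs)
    score = cross_val_score(LogisticRegression(max_iter=1000), bow, labels, cv=2).mean()
    print(f"BOW {ngram_range}: {score:.2f}")

# word2vec features: train embeddings on the (unlabeled) corpus, then
# represent each document as the mean of its word vectors.
tokenized = [d.split() for d in docs]
w2v = Word2Vec(sentences=tokenized, vector_size=50, min_count=1, epochs=50, seed=0)
doc_vecs = np.array([np.mean([w2v.wv[t] for t in toks], axis=0) for toks in tokenized])
score = cross_val_score(LogisticRegression(max_iter=1000), doc_vecs, labels, cv=2).mean()
print(f"word2vec (mean-pooled): {score:.2f}")
```

In practice, the embeddings would be trained on a large unlabeled clinical corpus rather than the labeled training set itself, which is the source of the dimensionality and data-efficiency advantages described above.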