Domain-Specific Word Embedding Matrix for Training Neural Networks

2019 
Text is one of the most widespread forms of sequential data and is therefore well suited to deep learning models designed for sequences. Deep learning for natural language processing is pattern recognition applied to words, sentences, and paragraphs. This study describes the process of creating a pre-trained word embedding matrix and its subsequent use in various neural network models for the classification of domain-specific texts. Word embeddings are one of the most popular ways to associate vectors with words. A word embedding matrix maps words into a vector space that captures semantic relationships between them, and these relationships can vary from task to task.
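The abstract does not include the authors' implementation; as a minimal sketch of the core idea, the snippet below builds an embedding matrix whose rows are pre-trained vectors for in-vocabulary words, with small random vectors for out-of-vocabulary words. The `build_embedding_matrix` helper and the toy "pretrained" dictionary are illustrative assumptions, not the study's actual code; in practice the pre-trained vectors would come from a domain-specific corpus.

```python
import numpy as np

def build_embedding_matrix(vocab, pretrained, dim, seed=0):
    """Hypothetical helper: return a len(vocab) x dim matrix where
    row i holds the pre-trained vector for vocab[i], or a small
    random vector when the word is out of vocabulary."""
    rng = np.random.default_rng(seed)
    matrix = np.zeros((len(vocab), dim), dtype=np.float32)
    for i, word in enumerate(vocab):
        vec = pretrained.get(word)
        if vec is not None:
            matrix[i] = vec                      # known word: copy its vector
        else:
            matrix[i] = rng.normal(scale=0.1, size=dim)  # OOV: random init
    return matrix

# Toy "pretrained" domain vectors, assumed for illustration only.
pretrained = {
    "neural":  np.ones(4, dtype=np.float32),
    "network": np.full(4, 2.0, dtype=np.float32),
}
vocab = ["neural", "network", "unseenword"]
E = build_embedding_matrix(vocab, pretrained, dim=4)
```

A matrix like `E` would typically initialize the embedding layer of a neural text classifier, either frozen or fine-tuned during training.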