Enhancing Unsupervised Pretraining with External Knowledge for Natural Language Inference

2019 
Unsupervised pretraining such as BERT (Bidirectional Encoder Representations from Transformers) [2] represents the most recent advance in learning representations for natural language and has helped achieve leading performance on many natural language processing problems. Although BERT can leverage large corpora, we hypothesize that it cannot learn all the semantics and knowledge needed for natural language inference (NLI). In this paper, we leverage human-authored external knowledge to further improve BERT, and our results show that BERT, the current state-of-the-art pretraining framework, can benefit from such external knowledge.
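
The abstract does not specify how the external knowledge is injected into the model. Below is a minimal illustrative sketch, not the paper's actual implementation: it assumes the knowledge takes the form of WordNet lexical relations (synonymy, antonymy, hypernymy) counted between premise and hypothesis words, concatenated with BERT's [CLS] representation, and fed to a three-way NLI classifier. The names wordnet_relation_features and KnowledgeEnhancedNLI, as well as the choice of features, are hypothetical.

# Hypothetical sketch only; assumes the Hugging Face `transformers` and `nltk`
# libraries (run nltk.download('wordnet') once). Not the paper's code.
import torch
import torch.nn as nn
from transformers import BertModel
from nltk.corpus import wordnet as wn


def wordnet_relation_features(premise_words, hypothesis_words):
    """Count simple lexical relations between premise and hypothesis words.
    Returns a 3-dim feature vector: [#synonym pairs, #antonym pairs, #hypernym pairs]."""
    syn = ant = hyper = 0
    for p in premise_words:
        p_syns = wn.synsets(p)
        if not p_syns:
            continue
        p_lemmas = {l for s in p_syns for l in s.lemmas()}
        p_hypers = {h for s in p_syns for h in s.hypernyms()}
        for w in hypothesis_words:
            h_syns = wn.synsets(w)
            if not h_syns:
                continue
            h_lemma_names = {l.name() for s in h_syns for l in s.lemmas()}
            if any(s in h_syns for s in p_syns):
                syn += 1                               # shared synset -> synonym-like pair
            if any(a.name() in h_lemma_names
                   for l in p_lemmas for a in l.antonyms()):
                ant += 1                               # antonym pair
            if any(h in h_syns for h in p_hypers):
                hyper += 1                             # hypothesis word is a hypernym
    return torch.tensor([syn, ant, hyper], dtype=torch.float)


class KnowledgeEnhancedNLI(nn.Module):
    """BERT [CLS] vector concatenated with knowledge features, followed by a
    linear classifier over {entailment, neutral, contradiction}."""
    def __init__(self, n_knowledge_features=3, n_labels=3):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size
        self.classifier = nn.Linear(hidden + n_knowledge_features, n_labels)

    def forward(self, input_ids, attention_mask, token_type_ids, knowledge_feats):
        # knowledge_feats: (batch, n_knowledge_features), e.g. stacked outputs
        # of wordnet_relation_features for each premise/hypothesis pair.
        out = self.bert(input_ids=input_ids,
                        attention_mask=attention_mask,
                        token_type_ids=token_type_ids)
        cls = out.last_hidden_state[:, 0]              # [CLS] representation
        combined = torch.cat([cls, knowledge_feats], dim=-1)
        return self.classifier(combined)

Concatenating the knowledge features after BERT keeps the pretrained encoder untouched; alternative designs (e.g., injecting relation features into attention) are equally plausible given only the abstract.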