Knowledge graph question answering based on TE-BiLSTM and knowledge graph embedding

2021 
Abstract: Knowledge graph question answering (KGQA) aims to answer natural language questions using facts stored in a knowledge graph. Relation extraction, one of the sub-tasks of KGQA, is both important and difficult. To improve the accuracy of relation extraction in KGQA, we propose a new deep neural network model called Transformer Encoder-BiLSTM (TE-BiLSTM). We describe the design of our method in detail, and our experimental results demonstrate that our approach not only achieves better results in relation extraction but also outperforms state-of-the-art approaches in KGQA.
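The abstract names the model only as a Transformer Encoder followed by a BiLSTM. A minimal PyTorch sketch of that layout for relation classification is shown below; every hyperparameter, layer choice, and the mean-pooling classifier head are illustrative assumptions, not details taken from the paper.

```python
# Hedged sketch of a TE-BiLSTM-style model: a Transformer encoder
# contextualizes the question tokens, a BiLSTM then reads the encoded
# sequence, and a linear head scores candidate relations.
# All sizes below (vocab, d_model, etc.) are placeholder assumptions.
import torch
import torch.nn as nn

class TEBiLSTM(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64, n_heads=4,
                 n_layers=2, hidden=64, n_relations=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        self.bilstm = nn.LSTM(d_model, hidden, batch_first=True,
                              bidirectional=True)
        self.classifier = nn.Linear(2 * hidden, n_relations)

    def forward(self, token_ids):
        x = self.embed(token_ids)              # (B, T, d_model)
        x = self.encoder(x)                    # self-attention over tokens
        x, _ = self.bilstm(x)                  # (B, T, 2 * hidden)
        return self.classifier(x.mean(dim=1))  # pool, score each relation

model = TEBiLSTM()
logits = model(torch.randint(0, 1000, (2, 12)))  # batch of 2 questions
print(logits.shape)  # torch.Size([2, 10])
```

At inference, the predicted relation for a question would be `logits.argmax(dim=-1)`; in a full KGQA pipeline this relation, together with the detected topic entity, selects the answer from the knowledge graph.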