Modeling Global Semantics for Question Answering over Knowledge Bases

2021 
The query graph, as the junction of semantic parsing in question answering over knowledge bases (KBQA), connects questions to logical queries. Although a query graph contains rich information, such as structure and relations, current KBQA models mainly exploit limited relation information in a naive way. Learning the representation of a query graph from this information is difficult because of the heterogeneity of query graphs and the intricate correlations among relations. In this paper, we propose a Global Semantic-based Message Passing (GSMP) model that captures the global semantics of a query graph from its structure and relation information. In GSMP, we present a recurrent-based relational graph convolutional network (RGCN) to encode heterogeneous query graphs, where the recurrent unit improves the ability of the RGCN to process small-scale query graphs. Moreover, we present a contextual method to resolve the ambiguity caused by intricate correlations, using the contextual adjacency of relations to refine relation representations. Finally, we present a nonlinear gate-based encoder that learns the representation of a question's syntactic tree, as the structural information of the question, for better matching against the global semantics of query graphs. Experiments on standard benchmarks show that our model outperforms off-the-shelf models.
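The abstract does not give the layer equations, but the idea of combining relation-specific message passing with a recurrent unit can be illustrated with a minimal numpy sketch. All names here (`rgcn_gru_layer`, the `gru` weight dictionary, the mean normalization) are illustrative assumptions, not the paper's actual formulation: an RGCN-style step aggregates messages through per-relation weight matrices, and a GRU-style gate then decides how much of the aggregated message updates each node state.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rgcn_gru_layer(h, edges, W_rel, W_self, gru):
    """One RGCN-style message-passing step followed by a GRU-style update.

    h      : (n, d) node representations
    edges  : list of (src, rel, dst) triples of the query graph
    W_rel  : (num_rels, d, d) relation-specific weight matrices
    W_self : (d, d) self-loop weight matrix
    gru    : dict of gate weights Wz, Uz, Wr, Ur, Wh, Uh (each (d, d));
             the layout is a hypothetical choice, not from the paper
    """
    n, d = h.shape
    msg = h @ W_self          # self-loop contribution
    deg = np.ones(n)          # count the self-loop for normalization
    for src, rel, dst in edges:
        msg[dst] += h[src] @ W_rel[rel]   # relation-specific message
        deg[dst] += 1.0
    m = msg / deg[:, None]    # mean-normalized aggregated message

    # GRU-style recurrent unit: the aggregated message acts as the input,
    # the previous node state acts as the hidden state.
    z = sigmoid(m @ gru["Wz"] + h @ gru["Uz"])        # update gate
    r = sigmoid(m @ gru["Wr"] + h @ gru["Ur"])        # reset gate
    c = np.tanh(m @ gru["Wh"] + (r * h) @ gru["Uh"])  # candidate state
    return (1 - z) * h + z * c

# Toy query graph: 3 nodes, 2 relation types, 2 edges
np.random.seed(0)
d = 4
h = np.random.randn(3, d)
edges = [(0, 0, 1), (1, 1, 2)]
W_rel = np.random.randn(2, d, d) * 0.1
W_self = np.eye(d)
gru = {k: np.random.randn(d, d) * 0.1
       for k in ["Wz", "Uz", "Wr", "Ur", "Wh", "Uh"]}
out = rgcn_gru_layer(h, edges, W_rel, W_self, gru)
```

Because the gating interpolates between the old state `h` and the candidate `c`, stacking only a few such layers can still propagate useful signal on the small-scale query graphs the abstract mentions, which is one plausible reading of why the recurrent unit helps the RGCN there.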