Learning attention-based representations from multiple patterns for relation prediction in knowledge graphs

2022 
Knowledge bases, and their representations in the form of knowledge graphs (KGs), are naturally incomplete. Since scientific and industrial applications have extensively adopted them, there is a high demand for solutions that complete their information. Several recent works tackle this challenge by learning embeddings for entities and relations, then employing them to predict new relations among the entities. Despite their advances, most of those methods focus only on the local neighbors of a relation to learn the embeddings. As a result, they may fail to capture the KGs' context information by neglecting long-term dependencies and the propagation of entities' semantics. In this manuscript, we propose ÆMP (Attention-based Embeddings from Multiple Patterns), a novel model for learning contextualized representations by: acquiring entities' context information through an attention-enhanced message-passing scheme, which captures the entities' local semantics while focusing on different aspects of their neighborhood; and
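The attention-enhanced message-passing idea described above can be sketched as a single aggregation step: each neighbor of an entity is scored against that entity, the scores are normalized into attention weights, and the neighbors' messages are mixed by those weights. This is a minimal illustration under assumed scaled dot-product attention; the function and parameter names (`attention_message_passing`, `W_q`, `W_k`, `W_v`) are hypothetical and not taken from the paper.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_message_passing(entity_emb, neighbor_embs, W_q, W_k, W_v):
    """One attention-weighted aggregation over an entity's neighborhood.

    A sketch only: projects the centre entity to a query and each neighbor
    to a key/message, then returns the attention-weighted sum of messages.
    All parameter names here are illustrative assumptions, not ÆMP's API.
    """
    q = W_q @ entity_emb                    # query from the centre entity, shape (d,)
    keys = neighbor_embs @ W_k.T            # one key per neighbor, shape (n, d)
    scores = keys @ q / np.sqrt(len(q))     # scaled dot-product scores, shape (n,)
    alpha = softmax(scores)                 # attention weights over the neighborhood
    messages = neighbor_embs @ W_v.T        # neighbor messages, shape (n, d)
    return alpha @ messages                 # attended aggregate, shape (d,)

# Toy usage with random embeddings.
rng = np.random.default_rng(0)
d = 4
entity = rng.normal(size=d)
neighbors = rng.normal(size=(3, d))          # three neighboring entities
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
aggregated = attention_message_passing(entity, neighbors, W_q, W_k, W_v)
```

Stacking several such steps, each attending to a different aspect of the neighborhood, is one way to propagate entity semantics beyond immediate neighbors.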