Ontological Concept Structure Aware Knowledge Transfer for Inductive Knowledge Graph Embedding
2021
Conventional knowledge graph embedding methods mainly assume that all entities at the reasoning stage are available in the original training graph. However, in real-world application scenarios, newly emerging entities are inevitable, which results in the severe problem of out-of-knowledge-graph entities. Existing efforts on this issue mostly either utilize additional resources, e.g., entity descriptions, or simply aggregate in-knowledge-graph neighbors to embed these new entities inductively. However, high-quality additional resources are usually hard to obtain, and the existing neighbors of new entities may be too sparse to provide enough information for modeling them. Meanwhile, these methods may fail to integrate the rich information of ontological concepts, which provide a general picture of instance entities and usually remain unchanged in the knowledge graph. To this end, we propose a novel inductive framework, namely CatE, to solve the sparsity problem with enhancement from ontological concepts. Specifically, we first adopt a transformer encoder to model the complex contextual structure of the ontological concepts. Then, we further develop a template refinement strategy for generating the target entity embedding, where the concept embedding forms a basic skeleton of the target entity and the individual characteristics of the entity are enriched by its existing neighbors. Finally, extensive experiments on public datasets demonstrate the effectiveness of our proposed model compared with state-of-the-art baseline methods.
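The abstract outlines a two-stage idea: a transformer encoder summarizes the contextual structure of an unseen entity's ontological concepts into a "skeleton" embedding, which is then refined with information aggregated from the entity's existing in-graph neighbors. The paper's exact architecture is not given here, so the following is only a minimal hypothetical sketch of that idea; the module names, dimensions, mean pooling, and gated fusion are illustrative assumptions, not CatE's actual design.

```python
# Hypothetical sketch: concept-skeleton encoding + neighbor-based refinement.
# All components are assumptions made for illustration, not the paper's model.
import torch
import torch.nn as nn


class ConceptSkeletonEncoder(nn.Module):
    """Encode an unseen entity's concept context with a transformer encoder."""

    def __init__(self, num_concepts: int, dim: int = 128, heads: int = 4, layers: int = 2):
        super().__init__()
        self.concept_emb = nn.Embedding(num_concepts, dim)
        enc_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)

    def forward(self, concept_ids: torch.Tensor) -> torch.Tensor:
        # concept_ids: (batch, num_ctx_concepts) -> skeleton: (batch, dim)
        ctx = self.encoder(self.concept_emb(concept_ids))
        return ctx.mean(dim=1)  # pool the concept context into one skeleton vector


class TemplateRefiner(nn.Module):
    """Refine the concept skeleton with the entity's existing in-KG neighbors."""

    def __init__(self, dim: int = 128):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, skeleton: torch.Tensor, neighbor_embs: torch.Tensor) -> torch.Tensor:
        # neighbor_embs: (batch, num_neighbors, dim); mean-aggregate, then gate-fuse
        # the neighbor summary with the concept skeleton (an assumed fusion choice).
        nbr = neighbor_embs.mean(dim=1)
        g = torch.sigmoid(self.gate(torch.cat([skeleton, nbr], dim=-1)))
        return g * skeleton + (1 - g) * nbr


# Toy usage: embed 2 unseen entities, each with 3 context concepts and 5 neighbors.
if __name__ == "__main__":
    enc, refiner = ConceptSkeletonEncoder(num_concepts=50), TemplateRefiner()
    concepts = torch.randint(0, 50, (2, 3))
    neighbors = torch.randn(2, 5, 128)
    entity_emb = refiner(enc(concepts), neighbors)
    print(entity_emb.shape)  # torch.Size([2, 128])
```

The gated fusion mirrors the abstract's intuition that the concept embedding supplies the general outline while neighbors contribute entity-specific detail; when neighbors are sparse, the skeleton dominates.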