Time-Aware Representation Learning of Knowledge Graphs

2021 
Representation learning is a fundamental task in knowledge graph research and applications. Most existing approaches learn representations for entities and relations based only on static facts, ignoring temporal information completely. This paper aims to learn time-aware representations for entities and relations in knowledge graphs. Based on how temporal information affects the learned embeddings, we propose three assumptions and build three corresponding models: BTS, ETS, and RTS. In these models, we build two separate embedding spaces for entities and relations, and the standard translation condition is checked after projecting embedding vectors between these spaces with model-specific transformations. In terms of performance, the proposed RTS model achieves state-of-the-art results in three experiments conducted on two datasets, YAGO11k and Wikidata12k, which validates the effectiveness of our approach. Comparing the results of the three models, we find that relation embeddings are time-sensitive and form a natural ordering, while the effect of time on entity embeddings can be safely ignored in translation-based methods. Experiments also show that our findings can be used to simplify existing models such as HyTE.
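To make the "projection between separate spaces, then translation" idea concrete, here is a minimal sketch of a TransE-style scoring function with distinct entity and relation embedding spaces. The projection matrix M is a hypothetical stand-in: the abstract does not specify the model-specific transformations used by BTS, ETS, or RTS, so this illustrates only the general pattern, not the paper's actual models.

```python
# Sketch of a translation-based score with separate entity/relation spaces.
# M is a hypothetical entity-to-relation-space projection; the paper's
# model-specific transformations (BTS/ETS/RTS) are not given in the abstract.
import numpy as np

rng = np.random.default_rng(0)
dim_e, dim_r = 64, 32                 # separate dimensionalities per space

h = rng.normal(size=dim_e)            # head entity embedding (entity space)
t = rng.normal(size=dim_e)            # tail entity embedding (entity space)
r = rng.normal(size=dim_r)            # relation embedding (relation space)
M = rng.normal(size=(dim_r, dim_e))   # hypothetical projection matrix

def score(h, r, t, M):
    """Translation condition checked after projecting the entity
    embeddings into the relation space: ||M h + r - M t||_1."""
    return np.linalg.norm(M @ h + r - M @ t, ord=1)

print(score(h, r, t, M))              # lower score = more plausible triple
```

In a trained model, lower scores would indicate more plausible (head, relation, tail) facts; a time-aware variant would additionally condition the embeddings or the projection on the fact's timestamp.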