Entity Linking Facing Incomplete Knowledge Base
2018
Entity linking, which bridges text and knowledge bases, is a fundamental task in information extraction. Most existing approaches depend heavily on structural features and statistics from the target knowledge base; compared with raw text, these provide more discriminative information and make the task easier. However, in many closed domains such structural features and statistics are rarely available, and the target knowledge base may be as simple and sparse as a collection of separate entity records, each carrying only a textual description. As a result, few existing algorithms work well on such incomplete knowledge bases. In this paper, we propose a novel neural approach that requires only minimal textual information from the knowledge base. To extract features from text effectively, we employ a co-attention mechanism to emphasize discriminative words and suppress noise. Compared with existing “black box” neural approaches, the co-attention mechanism also gives our model better interpretability. We conduct experiments on the AIDA-CoNLL benchmark and evaluate performance with accuracy. Results show that our model achieves 82.3% accuracy and outperforms the baseline by 1.1%.
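The abstract does not give the exact formulation of the co-attention layer, so the following is only a minimal sketch of a generic co-attention step between a mention's context words and a candidate entity's description words. The module name, dimensions, pooling, and scoring function are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class CoAttention(nn.Module):
    """Hypothetical co-attention between mention context and entity description.

    A shared affinity matrix scores every (context word, description word) pair;
    softmax over each axis yields attention weights that emphasize words
    discriminative for the other side and down-weight noise.
    """

    def __init__(self, hidden_dim: int):
        super().__init__()
        # Bilinear affinity: score(i, j) = c_i^T W d_j
        self.affinity = nn.Parameter(torch.empty(hidden_dim, hidden_dim))
        nn.init.xavier_uniform_(self.affinity)

    def forward(self, context: torch.Tensor, description: torch.Tensor):
        # context:     (batch, m, hidden_dim) word representations of the mention context
        # description: (batch, n, hidden_dim) word representations of the entity description
        affinity = context @ self.affinity @ description.transpose(1, 2)  # (batch, m, n)

        # Attention of each side over the other, derived from the shared affinity matrix.
        ctx_over_desc = F.softmax(affinity, dim=2)   # each context word attends to description words
        desc_over_ctx = F.softmax(affinity, dim=1)   # each description word attends to context words

        attended_desc = ctx_over_desc @ description                  # (batch, m, hidden_dim)
        attended_ctx = desc_over_ctx.transpose(1, 2) @ context       # (batch, n, hidden_dim)

        # Pool to fixed-size vectors; here a simple dot product serves as the linking score.
        context_vec = attended_ctx.mean(dim=1)
        description_vec = attended_desc.mean(dim=1)
        score = (context_vec * description_vec).sum(dim=1)           # (batch,)
        return score, ctx_over_desc, desc_over_ctx


# Usage: score one mention context against one candidate entity description.
co_att = CoAttention(hidden_dim=128)
context = torch.randn(1, 20, 128)      # 20 context words
description = torch.randn(1, 60, 128)  # 60 description words
score, a_cd, a_dc = co_att(context, description)
```

The returned attention maps (`a_cd`, `a_dc`) can be inspected to see which words drove a linking decision, which is the interpretability benefit the abstract refers to.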