HKA: A Hierarchical Knowledge Attention Mechanism for Multi-Turn Dialogue System
2020
Generating informative responses by incorporating external knowledge into dialogue systems has attracted increasing attention. Most previous work incorporates such knowledge only in single-turn dialogue systems; few works address knowledge incorporation for multi-turn systems, because the hierarchy of knowledge in the dialogue context, from individual words up to whole utterances, is ignored. Motivated by this, we propose a novel hierarchical knowledge attention (HKA) mechanism for open-domain multi-turn dialogue systems, which applies word-level and utterance-level attention jointly. Experiments demonstrate that HKA incorporates more appropriate knowledge and enables state-of-the-art models to generate more informative responses. Further analysis shows that HKA improves the model's dialogue state management, especially when the number of dialogue turns is large.
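The two-level idea the abstract describes can be sketched as follows: attend over the words of each utterance to form one vector per utterance, then attend over those utterance vectors to summarize the context. This is a minimal illustrative sketch only; the dot-product scoring, shapes, and variable names are assumptions, not the paper's actual HKA formulation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys):
    # simple dot-product attention: score each key against the
    # query, normalize, and return the weighted sum of the keys
    weights = softmax(keys @ query)
    return weights @ keys

# toy context: 2 utterances x 3 words x 4-dim embeddings (random stand-ins)
rng = np.random.default_rng(0)
context = rng.normal(size=(2, 3, 4))
knowledge_query = rng.normal(size=4)  # hypothetical knowledge representation

# word-level attention: compress each utterance into one vector
utterance_vecs = np.stack([attend(knowledge_query, utt) for utt in context])

# utterance-level attention: compress the utterance vectors into a context vector
context_vec = attend(knowledge_query, utterance_vecs)
print(context_vec.shape)  # (4,)
```

In a full model the resulting context vector would condition knowledge selection and response generation; here it only demonstrates the hierarchical (word, then utterance) attention flow.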