Causal Narrative Comprehension: A New Perspective for Emotion Cause Extraction

2022 
Emotion Cause Extraction (ECE) aims to identify the cause clauses behind a given emotion expressed in a text, and has become an emerging topic in research communities such as affective computing and natural language processing. Although current ECE methods have made great progress in semantic understanding at the lexicon and sentence levels, they largely ignore the causal narratives of emotional text. These causal narratives take the form of semantic structure and are highly helpful for structure-level understanding of emotion causes. However, causal narrative is an abstract narratological concept, and its semantics differ considerably from common sequential information; how to properly model and exploit such narrative information to boost ECE performance remains an unresolved challenge. To this end, this paper proposes a novel Causal Narrative Comprehension Model (CNCM) for emotion cause extraction, which learns and leverages causal narrative information to address this problem. Specifically, we develop a Narrative-aware Causal Association (NCA) unit, which mines narrative cues about emotional results and uses the semantic correlation between causes and results to model the causal narratives of documents. In addition, we design a Result-aware Emotion Attention (REA) unit that makes full use of the known result of a causal narrative for a deeper understanding of emotional causal associations. By combining these two units and using them collaboratively, the model can better identify emotion causes in text through causal narrative comprehension.
Extensive experiments on public English and Chinese ECE benchmark datasets validate the effectiveness of CNCM, which outperforms state-of-the-art baselines by significant margins and demonstrates the potential of narrative information for long-text understanding.
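The abstract describes the REA unit only in outline. As a rough illustration of the underlying idea, attending over candidate cause clauses conditioned on a known emotion-result representation, a minimal sketch might look like the following. All function names, the dot-product scoring, and the toy vectors are illustrative assumptions, not the paper's actual formulation.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def result_aware_attention(clause_vecs, result_vec):
    """Hypothetical simplification of a result-aware attention step:
    score each candidate cause clause against the emotion-result
    representation via dot product, then build a result-conditioned
    document representation as the weighted sum of clause vectors."""
    scores = [sum(c * r for c, r in zip(clause, result_vec))
              for clause in clause_vecs]
    weights = softmax(scores)
    dim = len(result_vec)
    context = [sum(w * clause[d] for w, clause in zip(weights, clause_vecs))
               for d in range(dim)]
    return weights, context

# Toy example: the first clause is more similar to the result vector,
# so it should receive the larger attention weight.
weights, context = result_aware_attention([[2.0, 0.0], [0.0, 2.0]],
                                          [1.0, 0.0])
```

In a real model, the clause and result vectors would come from a learned encoder (e.g., a recurrent or Transformer-based clause encoder), and the scoring function would typically be parameterized rather than a plain dot product.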