Coreference Resolution: Are the Eliminated Spans Totally Worthless?

2021 
To date, various neural methods have been proposed for joint mention span detection and coreference resolution. However, existing studies on coreference resolution mainly rely on mention representations, while the remaining spans in the text are largely ignored and directly eliminated. In this paper, we investigate whether those eliminated spans are totally worthless, or to what extent they can help improve the performance of coreference resolution. To this end, we propose to refine mention representations by leveraging global spans, including the eliminated ones. On this basis, we further introduce an additional loss term that encourages diversity between different entity clusters. Experimental results on the document-level CoNLL-2012 Shared Task English dataset show that the eliminated spans are indeed useful and that our proposed approaches yield promising results in coreference resolution.
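The abstract does not spell out the form of the diversity loss, but one common way to encourage separation between entity clusters is to penalize the cosine similarity between cluster centroids. The sketch below is a minimal NumPy illustration of that idea, not the paper's actual formulation; the function name and the centroid-based design are assumptions.

```python
import numpy as np

def cluster_diversity_loss(mention_embs, cluster_ids):
    """Hypothetical diversity loss: average pairwise cosine similarity
    between the mean embeddings (centroids) of different entity clusters.
    Lower values mean more diverse (better separated) clusters.
    This is an illustrative sketch, not the paper's exact loss."""
    ids = np.asarray(cluster_ids)
    clusters = sorted(set(cluster_ids))
    # one centroid per entity cluster
    centroids = np.stack([mention_embs[ids == c].mean(axis=0)
                          for c in clusters])
    # normalize centroids to unit length for cosine similarity
    unit = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
    sim = unit @ unit.T  # pairwise cosine similarities
    k = len(clusters)
    # mean off-diagonal similarity (exclude each centroid with itself)
    return (sim.sum() - np.trace(sim)) / (k * (k - 1))
```

In a full model this scalar would be added to the standard coreference loss with a weighting coefficient, so that gradient updates push the clusters' centroids apart while the main objective groups coreferent mentions together.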