Logic Enhanced Commonsense Inference with Chain Transformer

2020 
We study the commonsense inference task, which aims to reason about and generate the causes and effects of a given event. Existing neural methods focus on understanding and representing the event itself but pay little attention to the relations between different commonsense dimensions (e.g., causes or effects) of the event, making the generated results logically inconsistent and unreasonable. To alleviate this issue, we propose Chain Transformer, a logic enhanced commonsense inference model that combines both direct and indirect inferences to construct a logical chain and thereby reason in a more logically consistent way. First, we apply a self-attention based encoder to represent and encode the given event. Then a chain of decoders reasons about and generates inferences for the different dimensions along the logical chain, with an attention module linking the decoders so that each one attends to the previously reasoned inferences. Experiments on two real-world datasets show that Chain Transformer outperforms previous methods on both automatic and human evaluation, and demonstrate that Chain Transformer generates more reasonable and logically consistent inference results.
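A minimal sketch of the chained-decoder idea described in the abstract, written in PyTorch under my own assumptions: the class name, hyperparameters, and the choice to concatenate previous decoder states into each decoder's cross-attention memory are illustrative, not the authors' actual implementation.

```python
# Illustrative sketch only: a shared encoder for the event, plus one decoder
# per commonsense dimension, where each decoder cross-attends to the event
# encoding AND to the hidden states of all previously reasoned dimensions.
import torch
import torch.nn as nn

class ChainTransformer(nn.Module):  # hypothetical name, after the paper's title
    def __init__(self, vocab_size, d_model=256, nhead=4, num_layers=2, num_dims=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers)
        # One decoder per commonsense dimension (e.g., cause, effect, intent).
        self.decoders = nn.ModuleList(
            nn.TransformerDecoder(
                nn.TransformerDecoderLayer(d_model, nhead, batch_first=True),
                num_layers,
            )
            for _ in range(num_dims)
        )
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, event_ids, target_ids_per_dim):
        # Encode the input event once; it seeds the logical chain.
        memory = self.encoder(self.embed(event_ids))
        logits, chain = [], [memory]
        for decoder, tgt_ids in zip(self.decoders, target_ids_per_dim):
            tgt = self.embed(tgt_ids)
            # Causal mask: each target position sees only earlier tokens.
            mask = nn.Transformer.generate_square_subsequent_mask(tgt.size(1))
            # Cross-attend to the event encoding plus every previously
            # reasoned dimension, concatenated along the sequence axis.
            hidden = decoder(tgt, torch.cat(chain, dim=1), tgt_mask=mask)
            chain.append(hidden)
            logits.append(self.out(hidden))
        return logits

# Usage example with toy shapes:
model = ChainTransformer(vocab_size=1000)
event = torch.randint(0, 1000, (2, 8))                      # 2 events, 8 tokens each
targets = [torch.randint(0, 1000, (2, 6)) for _ in range(3)]  # one target per dimension
logits = model(event, targets)                              # 3 tensors of (2, 6, 1000)
```

Growing the cross-attention memory as the chain advances is one plausible reading of "an attention module linking the decoders"; the paper may realize this linkage differently.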