HGEN: Learning Hierarchical Heterogeneous Graph Encoding for Math Word Problem Solving

2022 
Designing algorithms to solve math word problems (MWPs) is an important research topic in natural language processing and smart education. Solving an MWP requires transforming the problem text into a math equation. Although recent Graph2Tree-based models, which adopt homogeneous graph encoders to learn quantity representations, have achieved promising results in generating math equations, they account neither for the heterogeneity of graph nodes nor for long-distance dependencies between heterogeneous nodes. In this paper, we propose HGEN, a novel hierarchical heterogeneous graph encoder for MWPs. Specifically, HGEN first builds a heterogeneous graph and learns heterogeneous node embeddings through a node-level attention layer followed by a type-aware attention layer. HGEN then captures long-distance dependencies by propagating information across multi-hop nodes in a hierarchical manner. We conduct extensive experiments on two popular MWP datasets. Our empirical results show that HGEN significantly outperforms state-of-the-art Graph2Tree-based models in the literature.
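The abstract names two attention stages, node-level and type-aware, but gives no formulas. The sketch below illustrates one plausible reading of that two-stage design with simple dot-product scoring in NumPy: neighbors of each node type are first aggregated with node-level attention, and the per-type summaries are then fused with type-aware attention. All function names and the scoring form are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def node_level_attention(h_target, neighbor_embs):
    """Aggregate same-type neighbors of a target node.
    Scoring is plain dot-product attention (an assumption;
    the paper may use a learned attention vector instead)."""
    scores = softmax(np.array([h_target @ h_n for h_n in neighbor_embs]))
    return sum(a * h_n for a, h_n in zip(scores, neighbor_embs))

def type_aware_attention(per_type_embs, query):
    """Fuse the per-type aggregated embeddings with a
    (here randomly initialized) type-level query vector."""
    scores = softmax(np.array([query @ z for z in per_type_embs]))
    return sum(a * z for a, z in zip(scores, per_type_embs))

# Illustrative usage: a quantity node with two neighbor types
# (e.g. quantity neighbors and word neighbors).
rng = np.random.default_rng(0)
d = 8
h = rng.normal(size=d)                                  # target node embedding
quantity_nbrs = [rng.normal(size=d) for _ in range(3)]  # type-1 neighbors
word_nbrs = [rng.normal(size=d) for _ in range(2)]      # type-2 neighbors

z_quantity = node_level_attention(h, quantity_nbrs)
z_word = node_level_attention(h, word_nbrs)
h_new = type_aware_attention([z_quantity, z_word], rng.normal(size=d))
```

Stacking this update over several hops would correspond to the hierarchical multi-hop propagation the abstract describes, though the exact propagation scheme is not specified there.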