Neural Machine Translation With Noisy Lexical Constraints

2020 
In neural machine translation, lexically constrained decoding generates translations that strictly include user-predefined constraints; when the constraints are correct, it improves translation quality at the cost of additional decoding overhead. Unfortunately, constraints may contain mistakes in real-world situations, and incorrect constraints undermine lexically constrained decoding. In this article, we propose a novel framework that improves translation quality even when the constraints are noisy. The key idea is to treat the lexical constraints as external memories: the framework encodes the constraints with a memory encoder and then leverages the resulting memories with a memory integrator. Experiments demonstrate that our framework not only delivers substantial BLEU gains when handling noisy constraints, but also achieves a speedup in decoding. These results motivate us to apply our models to a new scenario in which the constraints are generated without user involvement. Experiments show that our models can indeed improve translation quality with such automatically generated constraints.
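The memory encoder / memory integrator design described above can be pictured with a minimal sketch, assuming a PyTorch Transformer setting. Everything below, including the class names `ConstraintMemoryEncoder` and `MemoryIntegrator`, the gating mechanism, and all dimensions, is an illustrative assumption rather than the authors' exact architecture. The point it conveys is that constraints are read through attention and gated into the decoder state, so noisy constraints can be softly down-weighted instead of being forced verbatim into the output, as in hard constrained decoding.

```python
import torch
import torch.nn as nn

class ConstraintMemoryEncoder(nn.Module):
    """Encodes lexical constraints (token sequences) into memory vectors.
    Hypothetical module: architecture and sizes are illustrative."""
    def __init__(self, vocab_size: int, d_model: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2,
        )

    def forward(self, constraint_ids: torch.Tensor) -> torch.Tensor:
        # constraint_ids: (batch, mem_len) -> memories: (batch, mem_len, d_model)
        return self.encoder(self.embed(constraint_ids))

class MemoryIntegrator(nn.Module):
    """Attends over constraint memories and gates the read vector into the
    decoder state, so unreliable constraints receive low gate values."""
    def __init__(self, d_model: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, dec_state: torch.Tensor, memory: torch.Tensor) -> torch.Tensor:
        # dec_state: (batch, tgt_len, d_model); memory: (batch, mem_len, d_model)
        read, _ = self.attn(dec_state, memory, memory)
        g = torch.sigmoid(self.gate(torch.cat([dec_state, read], dim=-1)))
        return dec_state + g * read  # soft, learnable use of the constraints

# Example wiring (hypothetical sizes):
# memories = ConstraintMemoryEncoder(vocab_size=32000, d_model=256)(constraint_ids)
# fused = MemoryIntegrator(d_model=256)(decoder_states, memories)
```

Because integration happens inside the model rather than in the search procedure, decoding remains ordinary beam search, which is consistent with the reported speedup over grid-style constrained decoding.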