The NiuTrans Machine Translation Systems for WMT20
2020
This paper describes the NiuTrans neural machine translation systems submitted to the WMT20 news translation tasks. We participated in five tasks in total: Japanese->English, English->Japanese, English->Chinese, Inuktitut->English, and Tamil->English, and ranked first in both directions of Japanese<->English. Our systems mainly employ iterative back-translation, model architectures of varying depth and width, iterative knowledge distillation, and iterative fine-tuning. We find that adequately widening and deepening the model simultaneously yields significant performance gains, and that the iterative fine-tuning strategy we implemented is effective for domain adaptation. For the Inuktitut->English and Tamil->English tasks, we built separate multilingual models and employed pre-trained word embeddings to obtain better performance.
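The iterative back-translation loop mentioned above can be sketched as follows. This is a minimal toy illustration, not the paper's actual pipeline: `train_model` and `translate` are hypothetical stand-ins (dictionary lookups) for training and decoding with a real NMT model; the overall loop structure (train a target-to-source model, back-translate monolingual target data into synthetic source sentences, augment the parallel data, retrain, repeat) is the technique itself.

```python
# Toy sketch of iterative back-translation. Real systems train full NMT
# models; here "training" memorizes pairs and "translation" is a lookup.

def train_model(pairs):
    """Stand-in training: memorize source->target mappings in a dict."""
    return dict(pairs)

def translate(model, sentence):
    """Stand-in decoding: look up the sentence, fall back to identity."""
    return model.get(sentence, sentence)

def iterative_back_translation(parallel, mono_tgt, rounds=2):
    """Each round: train a target->source model on the current data,
    back-translate monolingual target sentences into synthetic sources,
    and add the synthetic pairs to the source->target training data."""
    data = list(parallel)
    for _ in range(rounds):
        backward = train_model((tgt, src) for src, tgt in data)  # tgt->src
        synthetic = [(translate(backward, t), t) for t in mono_tgt]
        data = list(parallel) + synthetic  # genuine + synthetic pairs
    return train_model(data)  # final src->tgt model

forward = iterative_back_translation(
    parallel=[("konnichiwa", "hello")],
    mono_tgt=["hello", "goodbye"],
)
```

In practice each round retrains both directions from scratch or from a checkpoint on the augmented data, so the quality of the synthetic sources improves over the rounds.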