Knowledge Distillation for Multilingual Unsupervised Neural Machine Translation

04/21/2020
by   Haipeng Sun, et al.

Unsupervised neural machine translation (UNMT) has recently achieved remarkable results for several language pairs. However, a standard UNMT model can translate only between a single language pair and cannot produce results for multiple language pairs simultaneously; research on multilingual UNMT has therefore been limited. In this paper, we empirically introduce a simple method to translate between thirteen languages using a single encoder and a single decoder, leveraging multilingual data to improve UNMT for all language pairs. Building on these empirical findings, we propose two knowledge distillation methods to further enhance multilingual UNMT performance. Our experiments on a dataset with English translated to and from twelve other languages (spanning three language families and six language branches) show remarkable results: the model surpasses strong unsupervised individual baselines, achieves promising zero-shot translation performance between non-English language pairs, and alleviates poor performance on low-resource language pairs.
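The abstract does not spell out the two proposed distillation methods, but the core mechanism behind any knowledge distillation setup is a loss that pulls a student model's output distribution toward a teacher's. Below is a minimal, generic sketch of that objective in plain Python: temperature-softened softmax plus the KL-divergence term in the style of Hinton et al. (2015). The function names and the choice of temperature are illustrative assumptions, not details from this paper.

```python
import math

def softmax(logits, temperature=1.0):
    # Scale logits by temperature; a higher temperature yields a
    # softer (more uniform) probability distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL(teacher || student) over temperature-softened distributions:
    # the generic knowledge-distillation objective. In MT this would be
    # computed per target token; here a single position is shown.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
```

The loss is zero when the student's distribution matches the teacher's and strictly positive otherwise, so minimizing it transfers the teacher's soft predictions; in a multilingual setting, stronger bilingual models (or the multilingual model itself at an earlier stage) could plausibly serve as teachers for weaker language pairs.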


