Learning Language-Specific Layers for Multilingual Machine Translation

05/04/2023
by Telmo Pessoa Pires, et al.

Multilingual Machine Translation promises to improve translation quality between non-English languages. This is advantageous for several reasons, namely lower latency (no need to translate twice) and reduced error cascades (e.g., avoiding the loss of gender and formality information when translating through English). On the downside, adding more languages reduces model capacity per language, which is usually countered by increasing the overall model size, making training harder and inference slower. In this work, we introduce Language-Specific Transformer Layers (LSLs), which allow us to increase model capacity while keeping the amount of computation and the number of parameters used in the forward pass constant. The key idea is to make some layers of the encoder source- or target-language-specific, while keeping the remaining layers shared. We study the best placement for these layers using a neural architecture search inspired approach, and achieve an improvement of 1.3 chrF (1.5 spBLEU) points over not using LSLs with a separate-decoder architecture, and 1.9 chrF (2.2 spBLEU) with a shared-decoder one.
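
To make the idea concrete, below is a minimal PyTorch sketch of an encoder that interleaves shared Transformer layers with language-specific ones, indexed by the source or target language. The names (LanguageSpecificLayer, EncoderWithLSLs, num_languages) and the particular layer placement are illustrative assumptions, not the paper's implementation; in the paper the placement is chosen with a neural-architecture-search-inspired procedure.

```python
import torch
import torch.nn as nn


class LanguageSpecificLayer(nn.Module):
    """Sketch of a Language-Specific Layer: one Transformer encoder layer per
    language; only the layer for the given language runs in the forward pass."""

    def __init__(self, num_languages: int, d_model: int = 512, nhead: int = 8):
        super().__init__()
        # One standard Transformer encoder layer per language.
        self.layers = nn.ModuleList(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
            for _ in range(num_languages)
        )

    def forward(self, x: torch.Tensor, lang_id: int) -> torch.Tensor:
        # Route the batch (assumed to share one language) through that
        # language's layer; the other layers are not touched.
        return self.layers[lang_id](x)


class EncoderWithLSLs(nn.Module):
    """Shared layers interleaved with source- and target-indexed LSLs.
    The placement pattern here is illustrative only."""

    def __init__(self, num_languages: int, d_model: int = 512, nhead: int = 8):
        super().__init__()
        self.shared_1 = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.src_lsl = LanguageSpecificLayer(num_languages, d_model, nhead)
        self.shared_2 = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.tgt_lsl = LanguageSpecificLayer(num_languages, d_model, nhead)

    def forward(self, x: torch.Tensor, src_lang: int, tgt_lang: int) -> torch.Tensor:
        x = self.shared_1(x)
        x = self.src_lsl(x, src_lang)   # indexed by the source language
        x = self.shared_2(x)
        x = self.tgt_lsl(x, tgt_lang)   # indexed by the target language
        return x


# Usage: a batch of 4 sentences, 10 tokens each, model dimension 512.
enc = EncoderWithLSLs(num_languages=3)
out = enc(torch.randn(4, 10, 512), src_lang=0, tgt_lang=2)
print(out.shape)  # torch.Size([4, 10, 512])
```

In this sketch only one language-specific layer is executed per example, so the compute and the number of parameters used in the forward pass match those of a fully shared encoder of the same depth, while the total parameter count grows with the number of languages.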

Related research

04/14/2020 - Multilingual Machine Translation: Closing the Gap between Shared and Language-specific Encoder-Decoders
State-of-the-art multilingual machine translation relies on a universal ...

09/23/2020 - Multi-Pass Transformer for Machine Translation
In contrast with previous approaches where information flows only toward...

05/29/2020 - Training Multilingual Machine Translation by Alternately Freezing Language-Specific Encoders-Decoders
We propose a modular architecture of language-specific encoder-decoders ...

12/24/2020 - Gender Bias in Multilingual Neural Machine Translation: The Architecture Matters
Multilingual Neural Machine Translation architectures mainly differ in t...

05/23/2023 - Condensing Multilingual Knowledge with Lightweight Language-Specific Modules
Incorporating language-specific (LS) modules is a proven method to boost...

06/05/2022 - Multilingual Neural Machine Translation with Deep Encoder and Multiple Shallow Decoders
Recent work in multilingual translation advances translation quality sur...

02/07/2023 - Efficiently Upgrading Multilingual Machine Translation Models to Support More Languages
With multilingual machine translation (MMT) models continuing to grow in...
