Single Model Ensemble for Subword Regularized Models in Low-Resource Machine Translation

03/25/2022
by Sho Takase, et al.

Subword regularization uses multiple subword segmentations during training to improve the robustness of neural machine translation models. However, previous subword regularization methods use multiple segmentations only during training and rely on a single segmentation at inference time. In this study, we propose an inference strategy that addresses this discrepancy. The proposed strategy approximates the marginalized likelihood using multiple segmentations: the most plausible segmentation plus several sampled segmentations. Because the strategy aggregates predictions from several segmentations, it can be regarded as a single-model ensemble that requires no additional training cost. Experimental results show that the proposed strategy improves the performance of models trained with subword regularization on low-resource machine translation tasks.
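The aggregation idea in the abstract can be sketched as follows: score a translation candidate under the one-best source segmentation and several sampled segmentations, then average the resulting probabilities to approximate the marginalized likelihood. This is a minimal illustrative sketch, not the paper's implementation; the `segment_best`, `segment_sample`, and `log_prob` interfaces are hypothetical stand-ins for a real segmenter and NMT model.

```python
import math


def ensemble_score(candidate, source, segment_best, segment_sample, log_prob, k=4):
    """Approximate the marginalized log-likelihood of `candidate` by
    averaging probabilities over the one-best segmentation of `source`
    plus k sampled segmentations (hypothetical interfaces)."""
    segmentations = [segment_best(source)]
    segmentations += [segment_sample(source) for _ in range(k)]
    # Average in probability space (not log space) to approximate
    # marginalizing over segmentations, then return a log score.
    probs = [math.exp(log_prob(seg, candidate)) for seg in segmentations]
    return math.log(sum(probs) / len(probs))


# Toy demonstration with stub segmenters and a dummy scorer.
best = lambda s: tuple(s.split())        # "one-best": whitespace tokens
sample = lambda s: tuple(s)              # "sampled": character-level pieces
score = lambda seg, cand: -float(len(seg))  # dummy log-probability
s = ensemble_score("hello", "a b", best, sample, score, k=2)
```

In a real system, `log_prob` would be the NMT model's conditional log-likelihood of the target candidate given a particular source segmentation, and `segment_sample` would be a stochastic subword sampler of the kind used during subword-regularized training.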


Related research

- 04/29/2020: Adversarial Subword Regularization for Robust Neural Machine Translation. "Exposing diverse subword segmentations to neural machine translation (NM..."
- 11/05/2019: Data Diversification: An Elegant Strategy For Neural Machine Translation. "A common approach to improve neural machine translation is to invent new..."
- 03/20/2021: The Effectiveness of Morphology-aware Segmentation in Low-Resource Neural Machine Translation. "This paper evaluates the performance of several modern subword segmentat..."
- 05/29/2018: Distilling Knowledge for Search-based Structured Prediction. "Many natural language processing tasks can be modeled into structured pr..."
- 04/30/2020: Language Model Prior for Low-Resource Neural Machine Translation. "The scarcity of large parallel corpora is an important obstacle for neur..."
- 03/20/2022: Small Batch Sizes Improve Training of Low-Resource Neural MT. "We study the role of an essential hyper-parameter that governs the train..."
- 05/23/2023: One-stop Training of Multiple Capacity Models for Multilingual Machine Translation. "Training models with varying capacities can be advantageous for deployin..."
