MetaMT, a MetaLearning Method Leveraging Multiple Domain Data for Low Resource Machine Translation

12/11/2019
by Rumeng Li, et al.

Manipulating training data leads to robust neural models for MT.
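For context, below is a minimal sketch of the kind of episodic, multi-domain meta-learning the title describes. This is an illustrative assumption, not the paper's algorithm: it uses a first-order (Reptile-style) update, and `translation_loss`, the toy model, and the synthetic domain loaders are placeholders standing in for a real NMT model and domain corpora.

```python
import copy
import random
import torch

def translation_loss(model, src, tgt):
    # Placeholder loss: cross-entropy over the model's outputs. A real
    # setup would use a seq2seq NMT model and a token-level loss.
    return torch.nn.functional.cross_entropy(model(src), tgt)

def meta_train(model, domain_loaders, meta_lr=0.1, inner_lr=1e-3,
               inner_steps=3, meta_epochs=100):
    # Each episode: sample one domain "task", adapt a copy of the model
    # to it, then pull the shared initialization toward the adapted
    # weights (the Reptile first-order meta-update).
    for _ in range(meta_epochs):
        batches = random.choice(domain_loaders)   # one domain per episode
        fast = copy.deepcopy(model)               # task-specific copy
        opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)
        for _, (src, tgt) in zip(range(inner_steps), batches):
            opt.zero_grad()
            translation_loss(fast, src, tgt).backward()
            opt.step()
        with torch.no_grad():                     # first-order meta-update
            for p, q in zip(model.parameters(), fast.parameters()):
                p += meta_lr * (q - p)
    return model

# Toy usage: two synthetic "domains" and a linear stand-in for the model.
model = torch.nn.Linear(8, 4)
domains = [[(torch.randn(16, 8), torch.randint(0, 4, (16,)))
            for _ in range(4)] for _ in range(2)]
meta_train(model, domains, meta_epochs=10)
```

MAML proper would differentiate through the inner-loop updates; the first-order variant sketched here avoids second-order gradients at the cost of a coarser meta-gradient.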



Related research

04/30/2020  Simulated Multiple Reference Training Improves Low-Resource Machine Translation
Many valid translations exist for a given sentence, and yet machine tran...

10/30/2017  Machine Translation of Low-Resource Spoken Dialects: Strategies for Normalizing Swiss German
The goal of this work is to design a machine translation system for a lo...

03/20/2022  Small Batch Sizes Improve Training of Low-Resource Neural MT
We study the role of an essential hyper-parameter that governs the train...

10/13/2021  Bandits Don't Follow Rules: Balancing Multi-Facet Machine Translation with Multi-Armed Bandits
Training data for machine translation (MT) is often sourced from a multi...

12/22/2014  Pragmatic Neural Language Modelling in Machine Translation
This paper presents an in-depth investigation on integrating neural lang...

06/11/2018  Distributed Evaluations: Ending Neural Point Metrics
With the rise of neural models across the field of information retrieval...

06/11/2023  Neural Machine Translation for the Indigenous Languages of the Americas: An Introduction
Neural models have drastically advanced state of the art for machine tra...
