Enhanced Neural Machine Translation by Learning from Draft

10/04/2017
by Aodong Li, et al.

Neural machine translation (NMT) has recently achieved impressive results. A potential problem of existing NMT algorithms, however, is that decoding is conducted from left to right, without considering the right context. This paper proposes a two-stage approach to solve this problem. In the first stage, a conventional attention-based NMT system is used to produce a draft translation, and in the second stage, a novel double-attention NMT system is used to refine the translation by looking at the original input as well as the draft translation. This drafting-and-refinement process can obtain right-context information from the draft, hence producing more consistent translations. We evaluated this approach on two Chinese-English translation tasks, with 44k and 1M sentence pairs respectively. The experiments showed that our approach achieved positive improvements over the conventional NMT system: the improvements are 2.4 and 0.9 BLEU points on the small-scale and large-scale tasks, respectively.
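To make the second-stage architecture concrete, the sketch below shows one plausible form of a double-attention decoder step: the decoder attends separately to the source annotations and to the encoded first-stage draft, and conditions its prediction on both context vectors. This is only an illustrative sketch, not the authors' implementation; all names (DoubleAttentionDecoder, enc_src, enc_draft, etc.) and the choice of a GRU-based recurrent decoder with bilinear attention are assumptions.

```python
# Illustrative sketch of a double-attention decoder step (assumed design,
# not the paper's released code). It attends to the source sentence and to
# the first-stage draft translation, giving access to "right context".
import torch
import torch.nn as nn
import torch.nn.functional as F

class DoubleAttentionDecoder(nn.Module):
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # The decoder RNN consumes the previous target word plus two contexts.
        self.rnn = nn.GRUCell(emb_dim + 2 * hid_dim, hid_dim)
        # Separate bilinear attention scorers for source and draft annotations.
        self.att_src = nn.Linear(hid_dim, hid_dim, bias=False)
        self.att_draft = nn.Linear(hid_dim, hid_dim, bias=False)
        self.out = nn.Linear(hid_dim + 2 * hid_dim, vocab_size)

    def attend(self, query, keys, proj):
        # query: (B, H), keys: (B, T, H) -> context vector of shape (B, H)
        scores = torch.bmm(keys, proj(query).unsqueeze(2)).squeeze(2)  # (B, T)
        weights = F.softmax(scores, dim=-1)
        return torch.bmm(weights.unsqueeze(1), keys).squeeze(1)

    def step(self, prev_word, state, enc_src, enc_draft):
        # prev_word: (B,) previous target token ids; state: (B, H) decoder state
        # enc_src / enc_draft: (B, T, H) encoder annotations of source and draft
        ctx_src = self.attend(state, enc_src, self.att_src)
        ctx_draft = self.attend(state, enc_draft, self.att_draft)
        rnn_in = torch.cat([self.embed(prev_word), ctx_src, ctx_draft], dim=-1)
        state = self.rnn(rnn_in, state)
        logits = self.out(torch.cat([state, ctx_src, ctx_draft], dim=-1))
        return logits, state
```

In this reading, the first stage is an ordinary left-to-right attention-based NMT model, and the second stage reruns decoding with the extra draft-side attention so that words to the right of the current position (as seen in the draft) can influence the refined output.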

