Incremental Decoding and Training Methods for Simultaneous Translation in Neural Machine Translation

06/10/2018
by Fahim Dalvi, et al.

We address the problem of simultaneous translation by modifying the Neural MT decoder to operate with a dynamically built encoder and attention. We propose a tunable agent that decides the best segmentation strategy for a user-defined BLEU loss and Average Proportion (AP) constraint. Our agent outperforms the previously proposed Wait-if-diff and Wait-if-worse agents (Cho and Esipova, 2016) on BLEU at lower latency. Second, we propose data-driven changes to Neural MT training to better match the incremental decoding framework.
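
As a rough illustration (not the paper's implementation), the sketch below shows the READ/WRITE loop that such a segmentation agent controls, together with the Average Proportion latency metric of Cho and Esipova (2016). The names `translate_prefix` and `agent_writes` are hypothetical stand-ins for the incremental decoder and the tunable agent.

```python
from typing import Callable, List

def average_proportion(g: List[int], src_len: int) -> float:
    """Average Proportion (Cho and Esipova, 2016): the mean fraction of the
    source consumed when each target token is emitted. g[t] is the number of
    source tokens that had been read when target token t was produced."""
    if not g:
        return 0.0
    return sum(g) / (src_len * len(g))

def simultaneous_decode(
    source: List[str],
    translate_prefix: Callable[[List[str]], List[str]],  # hypothetical incremental decoder
    agent_writes: Callable[[List[str], List[str]], bool],  # hypothetical segmentation agent
):
    """Alternate READ (consume a source token) and WRITE (commit a target
    token) actions until the source is exhausted and the hypothesis is
    fully committed."""
    read = 0                 # number of source tokens consumed so far
    target: List[str] = []   # target tokens committed so far
    g: List[int] = []        # g[t]: source tokens read when target token t was emitted
    while True:
        # Re-decode from the current source prefix; the paper's decoder
        # rebuilds the encoder states and attention dynamically as the
        # prefix grows.
        hyp = translate_prefix(source[:read]) if read else []
        pending = len(hyp) > len(target)
        if read < len(source) and not (pending and agent_writes(source[:read], target)):
            read += 1                        # READ: extend the source prefix
        elif pending:
            target.append(hyp[len(target)])  # WRITE: commit the next target token
            g.append(read)
        else:
            break                            # source exhausted, hypothesis committed
    return target, average_proportion(g, len(source))
```

In this framing, the baseline agents plug in as `agent_writes` policies: Wait-if-worse commits a token only if reading one more source token would not increase the model's confidence in it, while Wait-if-diff commits only if further reading would not change the predicted token. The paper's tunable agent instead trades off the BLEU loss against the AP budget directly.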

Related research

- 11/10/2020: Simultaneous Speech-to-Speech Translation System with Neural Incremental ASR, MT, and TTS. This paper presents a newly developed, simultaneous neural speech-to-spe...
- 06/07/2016: Can neural machine translation do simultaneous translation? We investigate the potential of attention-based neural machine translati...
- 10/03/2016: Learning to Translate in Real-time with Neural Machine Translation. Translating in real-time, a.k.a. simultaneous translation, outputs trans...
- 03/04/2022: From Simultaneous to Streaming Machine Translation by Leveraging Streaming History. Simultaneous Machine Translation is the task of incrementally translatin...
- 06/07/2016: Memory-enhanced Decoder for Neural Machine Translation. We propose to enhance the RNN decoder in a neural machine translator (NM...
- 06/03/2019: Training Neural Machine Translation To Apply Terminology Constraints. This paper proposes a novel method to inject custom terminology into neu...
- 05/17/2023: Accelerating Transformer Inference for Translation via Parallel Decoding. Autoregressive decoding limits the efficiency of transformers for Machin...
