Infusing Future Information into Monotonic Attention Through Language Models

09/07/2021
by Mohd Abbas Zaidi, et al.

Simultaneous neural machine translation (SNMT) models start emitting the target sequence before they have processed the entire source sequence. Recent adaptive policies for SNMT use monotonic attention to make read/write decisions based on the partial source and target sequences. The lack of sufficient information may cause the monotonic attention to make poor read/write decisions, which in turn degrades the performance of the SNMT model. Human translators, by contrast, make better read/write decisions because they can anticipate the immediate future words using linguistic information and domain knowledge. Motivated by human translators, in this work we propose a framework that aids monotonic attention with an external language model to improve its decisions. We conduct experiments on the MuST-C English-German and English-French speech-to-text translation tasks to show the effectiveness of the proposed framework. The proposed SNMT method improves the quality-latency trade-off over the state-of-the-art monotonic multihead attention.
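To make the idea concrete, here is a minimal, hypothetical sketch of a read/write policy in which an external language model's anticipation of the upcoming source word nudges the monotonic attention's write probability. All names, the threshold, and the additive combination rule are illustrative assumptions for this sketch, not the paper's actual formulation.

```python
def read_write_decision(p_write, lm_next_word_prob, threshold=0.5, boost=0.2):
    """Decide READ (consume more source) vs WRITE (emit a target token).

    p_write           -- the monotonic attention's own write probability
    lm_next_word_prob -- external LM's confidence in anticipating the
                         immediate next source word (future information)
    threshold, boost  -- illustrative hyperparameters, not from the paper
    """
    # If the LM confidently anticipates the upcoming source word, the model
    # needs less source context, so writing earlier becomes safer: nudge the
    # write probability upward in proportion to the LM's confidence.
    adjusted = min(1.0, p_write + boost * lm_next_word_prob)
    return "WRITE" if adjusted >= threshold else "READ"

# Low attention confidence and an unsure LM: keep reading source words.
print(read_write_decision(p_write=0.3, lm_next_word_prob=0.1))   # READ
# The LM strongly anticipates the next source word: commit to writing earlier.
print(read_write_decision(p_write=0.45, lm_next_word_prob=0.9))  # WRITE
```

The design choice here is that future information never forces a write on its own; it only lowers the amount of observed source context required before the policy is willing to commit, which is how it can improve the quality-latency trade-off.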

Related research:

- Decision Attentive Regularization to Improve Simultaneous Speech Translation Systems (10/13/2021): Simultaneous Speech-to-text Translation (SimulST) systems translate sour...
- Monotonic Multihead Attention (09/26/2019): Simultaneous machine translation models start generating a target sequen...
- Simpler and Faster Learning of Adaptive Policies for Simultaneous Translation (09/04/2019): Simultaneous translation is widely useful but remains challenging. Previ...
- Monotonic Infinite Lookback Attention for Simultaneous Machine Translation (06/12/2019): Simultaneous machine translation begins to translate each source sentenc...
- Modeling Dual Read/Write Paths for Simultaneous Machine Translation (03/17/2022): Simultaneous machine translation (SiMT) outputs translation while readin...
- Anticipation-free Training for Simultaneous Translation (01/30/2022): Simultaneous translation (SimulMT) speeds up the translation process by ...
- Reinforcement Learning for on-line Sequence Transformation (05/28/2021): A number of problems in the processing of sound and natural language, as...
