Sometimes We Want Translationese

by Prasanna Parthasarathi et al.

Rapid progress in Neural Machine Translation (NMT) systems over the last few years has been driven primarily by improving translation quality and, as a secondary focus, by improving robustness to input perturbations (e.g., spelling and grammatical mistakes). While performance and robustness are important objectives, by over-focusing on them we risk overlooking other important properties. In this paper, we draw attention to the fact that, for some applications, faithfulness to the original (input) text is important to preserve, even if it means introducing unusual language patterns in the (output) translation. We propose a simple, novel way to quantify whether an NMT system exhibits robustness and faithfulness, focusing on the case of word-order perturbations. We explore a suite of functions to perturb the word order of source sentences without deleting or injecting tokens, and measure the effects on the target side in terms of both robustness and faithfulness. Across several experimental conditions, we observe a strong tendency towards robustness rather than faithfulness. These results allow us to better understand the trade-off between faithfulness and robustness in NMT, and open up the possibility of developing systems where users have more autonomy and control in selecting which property best suits their use case.
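To make the idea of word-order perturbation concrete, here is a minimal sketch of one possible perturbation function: it swaps randomly chosen adjacent tokens, so the output is a permutation of the input with no tokens deleted or injected. This is an illustrative assumption, not the paper's actual perturbation suite; the function name and parameters are hypothetical.

```python
import random


def perturb_word_order(sentence: str, num_swaps: int = 2, seed: int = 0) -> str:
    """Perturb word order by swapping random adjacent token pairs.

    No tokens are deleted or injected: the output is always a
    permutation of the input tokens, matching the constraint
    described in the abstract.
    """
    rng = random.Random(seed)
    tokens = sentence.split()
    for _ in range(num_swaps):
        if len(tokens) < 2:
            break
        # Pick a position and swap it with its right-hand neighbour.
        i = rng.randrange(len(tokens) - 1)
        tokens[i], tokens[i + 1] = tokens[i + 1], tokens[i]
    return " ".join(tokens)


source = "the quick brown fox jumps over the lazy dog"
perturbed = perturb_word_order(source, num_swaps=3, seed=42)
# A robust system would translate `perturbed` the same way as `source`;
# a faithful system would reflect the scrambled order in its output.
```

Feeding both `source` and `perturbed` to the same NMT system and comparing the two outputs is one simple way to probe where the system sits on the robustness–faithfulness spectrum.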

