Levenshtein Transformer

05/27/2019
by Jiatao Gu et al.

Modern neural sequence generation models are built either to generate tokens step-by-step from scratch or to (iteratively) modify a sequence of tokens bounded by a fixed length. In this work, we develop the Levenshtein Transformer, a new partially autoregressive model devised for more flexible and amenable sequence generation. Unlike previous approaches, the atomic operations of our model are insertion and deletion. Their combination facilitates not only generation but also sequence refinement, allowing dynamic length changes. We also propose a set of new training techniques dedicated to these operations, effectively exploiting one as the other's learning signal thanks to their complementary nature. Experiments show that the proposed model achieves comparable performance but much-improved efficiency on both generation (e.g., machine translation, text summarization) and refinement tasks (e.g., automatic post-editing). We further confirm the flexibility of our model by showing that a Levenshtein Transformer trained for machine translation can straightforwardly be used for automatic post-editing.
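The decoding loop these operations imply is easy to picture. Below is a minimal, runnable sketch of the iterative refinement pass as the paper describes it: each iteration first predicts deletions, then predicts how many placeholders to insert in each slot, then fills every placeholder with a real token, and stops at a fixed point. The three policy functions, the `<plh>` token name, and the rule-based stubs are illustrative stand-ins for the model's learned classifiers, not the authors' implementation.

```python
# Sketch of the Levenshtein Transformer's refinement loop, assuming
# three (here rule-based, normally learned) policies: a deletion
# classifier, a placeholder-count predictor, and a token predictor.

PLH = "<plh>"          # placeholder token (assumed name)
BOS, EOS = "<s>", "</s>"

def delete_tokens(seq):
    """Stub deletion policy: mark each token keep (False) or delete (True)."""
    # Hypothetical rule: delete immediate repeats; a real model predicts this.
    return [i > 0 and seq[i] == seq[i - 1] and seq[i] not in (BOS, EOS)
            for i in range(len(seq))]

def insert_placeholders(seq):
    """Stub placeholder policy: how many placeholders go in each slot."""
    # Hypothetical rule: open one slot between BOS and EOS when empty.
    return [1 if seq[i] == BOS and seq[i + 1] == EOS else 0
            for i in range(len(seq) - 1)]

def fill_tokens(seq, vocab=("hello", "world")):
    """Stub token policy: replace each placeholder with a real token."""
    it = iter(vocab)
    return [next(it, "<unk>") if t == PLH else t for t in seq]

def refine(seq, max_iters=10):
    """One decoding run: alternate delete / insert until a fixed point."""
    for _ in range(max_iters):
        # Phase 1: deletion.
        mask = delete_tokens(seq)
        new = [t for t, d in zip(seq, mask) if not d]
        # Phase 2: placeholder insertion between surviving tokens.
        counts = insert_placeholders(new)
        with_plh = []
        for i, t in enumerate(new):
            with_plh.append(t)
            if i < len(counts):
                with_plh.extend([PLH] * counts[i])
        # Phase 3: token prediction for every placeholder, in parallel.
        filled = fill_tokens(with_plh)
        if filled == seq:  # fixed point: no edit changed the sequence
            break
        seq = filled
    return seq

print(refine([BOS, EOS]))                    # generation from scratch
print(refine([BOS, "hello", "hello", EOS]))  # refinement of a given draft
```

Because all deletion and insertion decisions within a pass are made in parallel rather than token by token, a small number of passes typically suffices, which is where the efficiency gain over step-by-step autoregressive decoding comes from.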
