KERMIT: Generative Insertion-Based Modeling for Sequences

06/04/2019
by William Chan, et al.

We present KERMIT, a simple insertion-based approach to generative modeling for sequences and sequence pairs. KERMIT models the joint distribution and its decompositions (i.e., marginals and conditionals) using a single neural network and, unlike much prior work, does not rely on a prespecified factorization of the data distribution. During training, one can feed KERMIT paired data (x, y) to learn the joint distribution p(x, y), and optionally mix in unpaired data x or y to refine the marginals p(x) or p(y). During inference, we have access to the conditionals p(x | y) and p(y | x) in both directions. We can also sample from the joint distribution or the marginals. The model supports both serial fully autoregressive decoding and parallel partially autoregressive decoding, with the latter exhibiting an empirically logarithmic runtime. We demonstrate through experiments in machine translation, representation learning, and zero-shot cloze question answering that our unified approach is capable of matching or exceeding the performance of dedicated state-of-the-art systems across a wide range of tasks without the need for problem-specific architectural adaptation.
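The "empirically logarithmic runtime" of parallel partially autoregressive decoding can be made concrete with a toy calculation. The sketch below (an illustration, not the actual KERMIT model; the function name is hypothetical) assumes the best case in which every gap between existing tokens receives one insertion per step, so the sequence length roughly doubles each iteration and a length-n output needs about log2(n) decoding steps.

```python
# Toy illustration of parallel insertion decoding (not the KERMIT model itself).
# Assumption: at each step, every slot (gap between or around tokens) inserts
# exactly one token in parallel, the best case for an insertion-based decoder.

def parallel_insertion_steps(target_len):
    """Count decoding steps needed to reach target_len tokens when
    each of the (length + 1) slots inserts one token per step."""
    length, steps = 0, 0
    while length < target_len:
        slots = length + 1   # gaps where a token can be inserted
        length += slots      # one parallel insertion per gap
        steps += 1
    return steps

for n in (10, 100, 1000, 100000):
    print(n, parallel_insertion_steps(n))
# After k steps the length is 2^k - 1, so steps grow as ~log2(n):
# 10 -> 4, 100 -> 7, 1000 -> 10, 100000 -> 17
```

In practice the model may insert fewer tokens per step, so the observed runtime is logarithmic only empirically, as the abstract states; a fully serial autoregressive decode would instead take n steps.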


Related research

- Insertion Transformer: Flexible Sequence Generation via Insertion Operations (02/08/2019)
- FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow (09/05/2019)
- Hint-Based Training for Non-Autoregressive Machine Translation (09/15/2019)
- Inference Strategies for Machine Translation with Conditional Masking (10/05/2020)
- MAGVLT: Masked Generative Vision-and-Language Transformer (03/21/2023)
- Examining Scaling and Transfer of Language Model Architectures for Machine Translation (02/01/2022)
- Learning the joint distribution of two sequences using little or no paired data (12/06/2022)
