Generating Safe Diversity in NLG via Imitation Learning

04/29/2020
by   Giulio Zhou, et al.

Deep-learning models for language generation tasks tend to produce repetitive output. Various methods have been proposed to encourage lexical diversity during decoding, but this often comes at a cost to the perceived fluency and adequacy of the output. In this work, we propose to ameliorate this cost by using an Imitation Learning approach to explore the level of diversity that a language generation model can safely produce. Specifically, we augment the decoding process with a meta-classifier trained to distinguish which words at any given timestep will lead to high-quality output. We focus our experiments on concept-to-text generation, where models are sensitive to the inclusion of irrelevant words due to the strict relation between input and output. Our analysis shows that previous methods for diversity underperform in this setting, while human evaluation suggests that our proposed method achieves a high level of diversity with minimal effect on the output's fluency and adequacy.
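The abstract describes augmenting decoding with a meta-classifier that judges which next tokens are "safe" at each timestep. The paper's actual training procedure and model are not reproduced here; the following is a minimal sketch of the decoding side only, assuming hypothetical callables `lm_logits_fn` (the language model's next-token logits) and `classifier_fn` (per-token quality probabilities from the meta-classifier), both invented for illustration.

```python
import numpy as np

def safe_diverse_decode(lm_logits_fn, classifier_fn, vocab_size,
                        max_len, threshold=0.5, temperature=1.0, rng=None):
    """Sample a sequence, restricting each step to tokens the meta-classifier
    predicts will lead to high-quality output (a sketch, not the paper's code).

    lm_logits_fn(prefix)  -> (vocab_size,) next-token logits      [assumed]
    classifier_fn(prefix) -> (vocab_size,) quality probabilities  [assumed]
    """
    rng = rng or np.random.default_rng(0)
    seq = []
    for _ in range(max_len):
        logits = lm_logits_fn(seq)
        safe = classifier_fn(seq) >= threshold  # boolean "safe token" mask
        if not safe.any():
            # No token is judged safe: fall back to the greedy choice,
            # mirroring the idea of only diversifying when it is harmless.
            seq.append(int(np.argmax(logits)))
            continue
        # Sample only among safe tokens, renormalising the distribution.
        masked = np.where(safe, logits / temperature, -np.inf)
        probs = np.exp(masked - masked.max())
        probs /= probs.sum()
        seq.append(int(rng.choice(vocab_size, p=probs)))
    return seq
```

Under this sketch, diversity comes from sampling rather than greedy decoding, while the classifier's mask keeps the sampler away from tokens predicted to degrade fluency or adequacy.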


Related research

05/08/2021: Neural Text Generation with Part-of-Speech Guided Softmax
Neural text generation models are likely to suffer from the low-diversit...

04/24/2022: One-Shot Domain-Adaptive Imitation Learning via Progressive Learning
Traditional deep learning-based visual imitation learning techniques req...

05/04/2020: Improving Adversarial Text Generation by Modeling the Distant Future
Auto-regressive text generation models usually focus on local fluency, a...

03/12/2020: Meta-CoTGAN: A Meta Cooperative Training Paradigm for Improving Adversarial Text Generation
Training generative models that can generate high-quality text with suff...

12/21/2022: Critic-Guided Decoding for Controlled Text Generation
Steering language generation towards objectives or away from undesired c...

11/14/2022: Evade the Trap of Mediocrity: Promoting Diversity and Novelty in Text Generation via Concentrating Attention
Recently, powerful Transformer architectures have proven superior in gen...

11/12/2018: Combining Learned Lyrical Structures and Vocabulary for Improved Lyric Generation
The use of language models for generating lyrics and poetry has received...
