Deep Recurrent Generative Decoder for Abstractive Text Summarization

08/02/2017
by   Piji Li, et al.

We propose a new framework for abstractive text summarization based on a sequence-to-sequence encoder-decoder model equipped with a deep recurrent generative decoder (DRGD). Latent structure information implied in the target summaries is learned with a recurrent latent random model to improve summarization quality. Neural variational inference is employed to address the intractable posterior inference over the recurrent latent variables. Abstractive summaries are generated from both the generative latent variables and the discriminative deterministic states. Extensive experiments on benchmark datasets in different languages show that DRGD achieves improvements over state-of-the-art methods.
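The core mechanism the abstract describes, sampling a latent variable at each decoding step via neural variational inference and combining it with the deterministic decoder state, can be sketched as follows. This is a minimal illustrative numpy sketch, not the paper's implementation: the dimensions, weight matrices, and single-step function are assumptions, and the reparameterization trick with a Gaussian prior stands in for the paper's full recurrent latent model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper).
HID, LAT, VOCAB = 8, 4, 16

# Randomly initialized parameters standing in for trained weights.
W_mu  = rng.normal(0, 0.1, (LAT, HID))          # hidden -> latent mean
W_lv  = rng.normal(0, 0.1, (LAT, HID))          # hidden -> latent log-variance
W_out = rng.normal(0, 0.1, (VOCAB, HID + LAT))  # [hidden; z] -> vocab logits

def decoder_step(h_t):
    """One decoding step: sample a latent z with the reparameterization
    trick, then predict the next token from both z (generative) and the
    deterministic state h_t (discriminative)."""
    mu      = W_mu @ h_t
    log_var = W_lv @ h_t
    eps     = rng.standard_normal(LAT)
    z       = mu + np.exp(0.5 * log_var) * eps   # reparameterized sample
    logits  = W_out @ np.concatenate([h_t, z])
    probs   = np.exp(logits - logits.max())
    probs  /= probs.sum()                        # softmax over the vocabulary
    # KL(q(z|h) || N(0, I)): the variational regularizer in the training loss
    kl = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)
    return probs, kl

probs, kl = decoder_step(rng.standard_normal(HID))
```

During training, the KL term would be added to the reconstruction loss to form the evidence lower bound that neural variational inference optimizes.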


Related research

02/19/2016 · Abstractive Text Summarization Using Sequence-to-Sequence RNNs and Beyond
In this work, we model abstractive text summarization using Attentional ...

04/20/2020 · On the Encoder-Decoder Incompatibility in Variational Text Modeling and Beyond
Variational autoencoders (VAEs) combine latent variables with amortized ...

05/23/2018 · Amortized Context Vector Inference for Sequence-to-Sequence Networks
Neural attention (NA) is an effective mechanism for inferring complex st...

04/05/2023 · To Asymmetry and Beyond: Structured Pruning of Sequence to Sequence Models for Improved Inference Efficiency
Sequence-to-sequence language models can be used to produce abstractive ...

11/28/2016 · Joint Copying and Restricted Generation for Paraphrase
Many natural language generation tasks, such as abstractive summarizatio...

02/04/2019 · Re-examination of the Role of Latent Variables in Sequence Modeling
With latent variables, stochastic recurrent models have achieved state-o...

10/11/2017 · Sequence stacking using dual encoder Seq2Seq recurrent networks
A widely studied non-polynomial (NP) hard problem lies in finding a rout...
