A Hierarchical Encoding-Decoding Scheme for Abstractive Multi-document Summarization

by Chenhui Shen, et al.

Pre-trained language models (PLMs) have achieved impressive results in abstractive single-document summarization (SDS). However, such benefits may not readily extend to multi-document summarization (MDS), where the interactions among documents are more complex. Previous works either design new architectures or new pre-training objectives for MDS, or apply PLMs to MDS without considering the complex document interactions. While the former does not make full use of previous pre-training efforts and may not generalize well across multiple domains, the latter cannot fully attend to the intricate relationships unique to MDS tasks. In this paper, we enforce hierarchy on both the encoder and decoder and seek to make better use of a PLM to facilitate multi-document interactions for the MDS task. We test our design on 10 MDS datasets across a wide range of domains. Extensive experiments show that our proposed method achieves consistent improvements on all these datasets, outperforming the previous best models, and even achieving better or competitive results compared to some models with additional MDS pre-training or larger model parameters.
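To make the two-level idea concrete, here is a minimal, hypothetical sketch of hierarchical encoding for multi-document input: tokens are first pooled within each document, then the documents are weighted against one another via an attention-style softmax. All function names, the mean-pooling stand-in for a PLM encoder, and the dot-product scoring are illustrative assumptions, not the paper's actual architecture.

```python
# Illustrative sketch (NOT the paper's method): two-level hierarchical
# encoding for MDS. Each document's token vectors are pooled into one
# document representation, then a cross-document attention step fuses
# the documents into a single context vector.
import math

def mean_pool(vectors):
    """Token level: average a document's token vectors (stand-in for a PLM encoder)."""
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def softmax(scores):
    """Numerically stable softmax over a list of scores."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def hierarchical_encode(documents, query):
    """Document level: pool each document, then attend across documents
    with a (hypothetical) query vector; returns the fused representation
    and the per-document attention weights."""
    doc_reps = [mean_pool(doc) for doc in documents]
    scores = [sum(q * d for q, d in zip(query, rep)) for rep in doc_reps]
    weights = softmax(scores)
    dim = len(query)
    fused = [sum(w * rep[i] for w, rep in zip(weights, doc_reps))
             for i in range(dim)]
    return fused, weights

# Toy input: two "documents", each a list of 2-d token vectors.
docs = [
    [[1.0, 0.0], [1.0, 0.0]],  # document 1
    [[0.0, 1.0], [0.0, 1.0]],  # document 2
]
fused, weights = hierarchical_encode(docs, query=[1.0, 0.0])
```

The point of the sketch is the structure, not the components: a real system would replace `mean_pool` with a pre-trained encoder and the dot-product scoring with learned attention, but the within-document then across-document flow is what "enforcing hierarchy on the encoder" refers to.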

