Investigating Pretrained Language Models for Graph-to-Text Generation

07/16/2020
by Leonardo F. R. Ribeiro, et al.

Graph-to-text generation, a subtask of data-to-text generation, aims to generate fluent text from graph-based data. Many graph-to-text models have achieved strong performance on this task by employing specialized graph encoders. However, recent approaches employing large pretrained language models (PLMs) have achieved state-of-the-art results in data-to-text generation. In this paper, we investigate the impact of large PLMs on graph-to-text generation. We present a study across three graph domains: meaning representations, Wikipedia knowledge graphs (KGs), and scientific KGs. Our analysis shows that PLMs such as BART and T5 achieve state-of-the-art results on graph-to-text benchmarks without explicitly encoding the graph structure. We also demonstrate that task-adaptive pretraining strategies benefit the target task, further improving the state of the art on two graph-to-text benchmarks. In a final analysis, we investigate possible reasons for the PLMs' success on graph-to-text tasks. We find evidence that their knowledge about the world gives them a significant advantage, especially when generating text from KGs.
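The key idea behind feeding a graph to a PLM without a specialized graph encoder is to linearize it into a token sequence. A minimal sketch of one common linearization scheme (the marker tokens `<H>`, `<R>`, `<T>` and the example triples are illustrative assumptions, not necessarily the paper's exact format):

```python
def linearize_graph(triples):
    """Flatten (head, relation, tail) KG triples into one text sequence.

    Special markers (<H>, <R>, <T>) delimit the graph components; a PLM
    such as BART or T5 can then consume the result as ordinary text,
    with no structure-aware encoder required.
    """
    parts = []
    for head, relation, tail in triples:
        parts.extend(["<H>", head, "<R>", relation, "<T>", tail])
    return " ".join(parts)

# Hypothetical Wikipedia-style KG input
triples = [
    ("Alan Shepard", "birthPlace", "New Hampshire"),
    ("Alan Shepard", "occupation", "Test pilot"),
]
print(linearize_graph(triples))
```

The linearized string would then be paired with its reference text and used to fine-tune the PLM as a standard sequence-to-sequence task.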


