COINS: Dynamically Generating COntextualized Inference Rules for Narrative Story Completion

06/04/2021
by Debjit Paul, et al.

Despite recent successes of large pre-trained language models in solving reasoning tasks, their inference capabilities remain opaque. We posit that such models can be made more interpretable by explicitly generating interim inference rules and using them to guide the generation of task-specific textual outputs. In this paper we present COINS, a recursive inference framework that i) iteratively reads context sentences, ii) dynamically generates contextualized inference rules and encodes them, and iii) uses the encoded rules to guide task-specific output generation. We apply COINS to a Narrative Story Completion task that asks a model to fill in a story's missing sentences so as to produce a coherent story with plausible logical connections, causal relationships, and temporal dependencies. By modularizing the inference and sentence-generation steps in a recurrent model, we aim to make the reasoning steps, and their effects on next-sentence generation, transparent. Our automatic and manual evaluations show that the model generates better story sentences than SOTA baselines, especially in terms of coherence. We further demonstrate improved performance over strong pre-trained LMs in generating commonsense inference rules. The recursive nature of COINS holds promise for controlled generation of longer sequences.
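For concreteness, here is a minimal sketch of the recursive read-infer-generate loop the abstract describes, assuming two fine-tuned GPT-2 checkpoints (one per module). The checkpoint choice, the separator tokens ([RULE], [END], [NEXT]), and the prompt format are illustrative assumptions, not the paper's exact interface.

```python
# Minimal sketch of the COINS loop, not the authors' implementation.
# Both modules are stubbed with base GPT-2; in practice each would be
# a model fine-tuned for its role (rule generation vs. sentence generation).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
rule_model = GPT2LMHeadModel.from_pretrained("gpt2")      # stand-in: inference-rule generator
sentence_model = GPT2LMHeadModel.from_pretrained("gpt2")  # stand-in: story-sentence generator

def generate(model, prompt, max_new_tokens=40):
    """Greedy decoding helper; returns only the newly generated text."""
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=False,
        pad_token_id=tokenizer.eos_token_id,
    )
    new_ids = output_ids[0, inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_ids, skip_special_tokens=True).strip()

def complete_story(context_sentences, ending, n_missing=3):
    """Recursively fill in n_missing sentences between the context and the ending."""
    story = list(context_sentences)
    for _ in range(n_missing):
        context = " ".join(story)
        # Step ii): generate a contextualized inference rule for the current context.
        rule = generate(rule_model, f"{context} [RULE]")
        # Step iii): condition next-sentence generation on context, rule, and ending.
        next_sentence = generate(
            sentence_model, f"{context} [RULE] {rule} [END] {ending} [NEXT]"
        )
        story.append(next_sentence)  # the grown context feeds the next iteration
    return story + [ending]
```

Keeping the two generators as separate modules is what makes each intermediate rule inspectable before it conditions the next sentence, which is the interpretability benefit the abstract claims.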

Related research

11/01/2022
Towards Inter-character Relationship-driven Story Generation
In this paper, we introduce the task of modeling interpersonal relations...

12/16/2022
Neural Story Planning
Automated plot generation is the challenge of generating a sequence of e...

01/24/2023
Can Very Large Pretrained Language Models Learn Storytelling With A Few Examples?
While pre-trained language models can generate individually fluent sente...

08/17/2020
Narrative Interpolation for Generating and Understanding Stories
We propose a method for controlled narrative/story generation where we a...

01/15/2020
A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation
Story generation, namely generating a reasonable story from a leading co...

06/07/2023
World Models for Math Story Problems
Solving math story problems is a complex task for students and NLP model...

02/18/2022
CLSEG: Contrastive Learning of Story Ending Generation
Story Ending Generation (SEG) is a challenging task in natural language ...
