WorldTree: A Corpus of Explanation Graphs for Elementary Science Questions supporting Multi-Hop Inference

by Peter A. Jansen et al.

Developing automated inference methods that can provide users with compelling human-readable justifications for why the answer to a question is correct is critical in domains such as science and medicine, where user trust and the cost of errors limit adoption. One of the central barriers to training question answering models on explainable inference tasks is the lack of gold explanations to serve as training data. In this paper we present a corpus of explanations for standardized science exams, a recent challenge task for question answering. We manually construct detailed explanations for nearly all publicly available standardized elementary science questions (approximately 1,680 questions spanning 3rd through 5th grade) and represent these as "explanation graphs": sets of lexically overlapping sentences that describe how to arrive at the correct answer to a question through a combination of domain and world knowledge. We also provide an explanation-centered tablestore, a collection of semi-structured tables containing the knowledge needed to construct these elementary science explanations. Together, these two knowledge resources map out a substantial portion of the knowledge required for answering and explaining elementary science exams, and provide both structured and free-text training data for the explainable inference task.
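The notion of an "explanation graph" as a set of lexically overlapping sentences can be sketched in a few lines of code. The snippet below is a minimal illustration, not the paper's actual pipeline: the question, answer, and facts are invented stand-ins for tablestore rows, and the overlap criterion (shared non-stopword tokens) is a simplification of the lexical-overlap linking the abstract describes.

```python
# Illustrative sketch: link sentences into an "explanation graph" when they
# share at least one content word. The facts below are hypothetical examples,
# not actual WorldTree tablestore rows.

STOPWORDS = {"a", "an", "the", "is", "are", "of", "to", "in", "and", "when", "it", "at"}

def content_words(sentence):
    """Lowercased tokens with punctuation stripped, minus stopwords."""
    return {w.strip(".,?").lower() for w in sentence.split()} - STOPWORDS

def explanation_graph(sentences):
    """Return edges (i, j, shared_words) between lexically overlapping sentences."""
    edges = []
    for i in range(len(sentences)):
        for j in range(i + 1, len(sentences)):
            overlap = content_words(sentences[i]) & content_words(sentences[j])
            if overlap:
                edges.append((i, j, sorted(overlap)))
    return edges

question = "What happens to water when it is heated to 100 degrees Celsius?"
answer = "The water boils."
facts = [
    "Boiling means a liquid changes into a gas.",   # world knowledge
    "Water is a kind of liquid.",                   # taxonomic knowledge
    "Water boils at 100 degrees Celsius.",          # domain knowledge
]

nodes = [question, answer] + facts
for i, j, shared in explanation_graph(nodes):
    print(i, j, shared)
```

Here the question and answer are connected to the facts (and to each other) through shared words such as "water" and "boils", so the facts jointly form a lexically connected path from question to correct answer.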




ExplanationLP: Abductive Reasoning for Explainable Science Question Answering


Science Question Answering using Instructional Materials


Ranking Facts for Explaining Answers to Elementary Science Questions


On the Challenges of Evaluating Compositional Explanations in Multi-Hop Inference: Relevance, Completeness, and Expert Ratings


Multi-hop Inference for Sentence-level TextGraphs: How Challenging is Meaningfully Combining Information for Science Question Answering?


Red Dragon AI at TextGraphs 2019 Shared Task: Language Model Assisted Explanation Generation


Unification-based Reconstruction of Explanations for Science Questions

