Exploring Equation as a Better Intermediate Meaning Representation for Numerical Reasoning

08/21/2023
by Dingzirui Wang et al.

Numerical reasoning is vital for natural language processing models to understand and process numerical information in real-world scenarios. Most current methods first generate an Intermediate Meaning Representation (IMR) of the question and then generate the answer from it. Current state-of-the-art (SOTA) methods use large language models (LLMs) to generate programs as IMRs. Intuitively, equations have fewer restrictions and are semantically closer to the question than programs, which should lead to higher generation accuracy. However, current LLMs generate equations worse than they generate programs, which we attribute to equation data being rarer than program data in pre-training corpora. In this paper, we therefore use equations as IMRs for the numerical reasoning task by addressing two problems: (1) theoretically, how to prove that equations are an IMR with higher generation accuracy than programs; (2) empirically, how to improve the accuracy with which LLMs generate equations. For the first problem, we propose and prove a proposition for theoretically comparing the generation accuracy of different IMRs. For the second problem, we present a method called Boosting Numerical Reasoning by Decomposing the Generation of Equations (Bridge), which improves the accuracy of LLMs in generating equations as IMRs by reducing their tendency to generate constant expressions and programs. Our method improves performance by 2.2 compared to the previous state-of-the-art methods under the single reasoning path setting. Our code and prompts are released at https://github.com/zirui-HIT/Bridge_for_Numerical_Reasoning.
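To make the contrast between the two IMR styles concrete, here is a minimal sketch that answers one toy word problem twice: once from a program-style IMR (by executing it) and once from an equation-style IMR (by handing it to a symbolic solver). This is not the paper's Bridge pipeline; the example question, the helper functions, and the use of SymPy are illustrative assumptions.

```python
# Minimal sketch (not the authors' Bridge implementation) contrasting a
# program-style IMR with an equation-style IMR for the same toy question.
# The question, helper names, and the use of SymPy are illustrative assumptions.
from sympy import Eq, solve, symbols, sympify

QUESTION = (
    "A pencil costs 2 dollars and a notebook costs 3 dollars. "
    "How much do 4 pencils and 2 notebooks cost?"
)

# Program-style IMR: executable code an LLM might emit.
program_imr = "answer = 4 * 2 + 2 * 3"

# Equation-style IMR: stays closer to the question's surface semantics,
# naming the unknown directly.
equation_imr = "x = 4 * 2 + 2 * 3"


def run_program(src: str) -> float:
    """Execute a program IMR and read back its `answer` variable."""
    scope: dict = {}
    exec(src, {}, scope)  # assumes the generated program is trusted/sandboxed
    return float(scope["answer"])


def solve_equation(src: str) -> float:
    """Solve an equation IMR of the form '<lhs> = <rhs>' for x with SymPy."""
    x = symbols("x")
    lhs, rhs = src.split("=", 1)
    equation = Eq(sympify(lhs, locals={"x": x}), sympify(rhs, locals={"x": x}))
    return float(solve(equation, x)[0])


print(run_program(program_imr))      # 14.0
print(solve_equation(equation_imr))  # 14.0
```

In this framing, the equation form mirrors the question's unknown directly, which is the intuition behind the paper's claim that equations sit semantically closer to the question than programs; Bridge's contribution is then to make LLMs generate such equations reliably instead of drifting into constant expressions or program code.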


Related research

Semantic Graphs for Generating Deep Questions (04/27/2020)
This paper proposes the problem of Deep Question Generation (DQG), which ...

Toward a Unified Framework for Unsupervised Complex Tabular Reasoning (12/20/2022)
Structured tabular data exist across nearly all fields. Reasoning task ...

Cumulative Reasoning with Large Language Models (08/08/2023)
While language models are powerful and versatile, they often fail to ...

Evaluating How Fine-tuning on Bimodal Data Effects Code Generation (11/15/2022)
Despite the increase in popularity of language models for code generation ...

Semantically-Aligned Equation Generation for Solving and Reasoning Math Word Problems (11/02/2018)
Solving math word problems is a challenging task that requires accurate ...

Learning by Analogy: Diverse Questions Generation in Math Word Problem (06/15/2023)
Solving math word problems (MWP) with AI techniques has recently made ...

Seeking Diverse Reasoning Logic: Controlled Equation Expression Generation for Solving Math Word Problems (09/21/2022)
To solve Math Word Problems, human students leverage diverse reasoning ...
