Textual Enhanced Contrastive Learning for Solving Math Word Problems

11/29/2022
by Yibin Shen, et al.

Solving math word problems is a task that analyzes the relations between quantities and requires an accurate understanding of contextual natural language information. Recent studies show that current models rely on shallow heuristics to predict solutions and can be easily misled by small textual perturbations. To address this problem, we propose a Textual Enhanced Contrastive Learning framework, which forces the models to distinguish semantically similar examples that hold different mathematical logic. We adopt a self-supervised strategy to enrich examples with subtle textual variance through textual reordering or problem re-construction. We then retrieve the hardest-to-differentiate samples from both the equation and textual perspectives and guide the model to learn their representations. Experimental results show that our method achieves state-of-the-art performance on both widely used benchmark datasets and carefully designed challenge datasets in English and Chinese. Our code and data are available at <https://github.com/yiyunya/Textual_CL_MWP>.
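The framework described above rests on a contrastive objective over problem representations. As a rough illustration only (not the authors' implementation), the sketch below shows an InfoNCE-style loss in PyTorch that pulls an encoded problem toward a textual variant sharing its equation and pushes it away from retrieved hard negatives; the function name, temperature value, and tensor shapes are illustrative assumptions, and the hard negatives stand in for the hardest-to-differentiate samples retrieved from the equation and textual perspectives.

```python
import torch
import torch.nn.functional as F


def contrastive_loss(anchor, positive, hard_negatives, temperature=0.1):
    """InfoNCE-style loss: pull the anchor toward its positive (a textual
    variant with the same equation) and push it away from hard negatives
    (textually similar problems with different mathematical logic).

    anchor:         (d,)   encoded original problem
    positive:       (d,)   encoded augmented variant (e.g., reordered text)
    hard_negatives: (k, d) encoded hardest-to-differentiate problems
    """
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    hard_negatives = F.normalize(hard_negatives, dim=-1)

    pos_sim = (anchor @ positive) / temperature        # scalar similarity
    neg_sim = (hard_negatives @ anchor) / temperature  # (k,) similarities
    logits = torch.cat([pos_sim.unsqueeze(0), neg_sim])  # (k + 1,)

    # The positive pair occupies index 0 of the logits.
    target = torch.zeros(1, dtype=torch.long)
    return F.cross_entropy(logits.unsqueeze(0), target)
```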
