Learning Phrase Embeddings from Paraphrases with GRUs

10/13/2017
by Zhihao Zhou, et al.

Learning phrase representations has been widely explored in many Natural Language Processing (NLP) tasks (e.g., sentiment analysis, machine translation) and has yielded promising improvements. Previous studies either learn non-compositional phrase representations with general word-embedding techniques or learn compositional phrase representations from syntactic structures; these approaches either require large amounts of human annotation or cannot easily generalize to all phrases. In this work, we propose to take advantage of a large-scale paraphrase database and present a pairwise gated recurrent units (pairwise-GRU) framework to generate compositional phrase representations. Our framework can be reused to generate representations for any phrase. Experimental results show that our framework achieves state-of-the-art results on several phrase-similarity tasks.
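The abstract describes the architecture only at a high level: a shared GRU encodes each phrase of a paraphrase pair into a vector, and training pushes the two vectors of a true paraphrase pair together. The sketch below illustrates that general idea in plain NumPy; it is not the paper's exact model, and all dimensions, weight initializations, and names are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUEncoder:
    """Minimal GRU that reads a phrase (a sequence of word vectors)
    and returns the final hidden state as the phrase embedding.
    Illustrative only -- not the paper's trained model."""
    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_dim)
        self.hidden_dim = hidden_dim
        # Parameters for the update (z), reset (r), and candidate (h) gates.
        self.Wz = rng.uniform(-s, s, (hidden_dim, input_dim))
        self.Uz = rng.uniform(-s, s, (hidden_dim, hidden_dim))
        self.Wr = rng.uniform(-s, s, (hidden_dim, input_dim))
        self.Ur = rng.uniform(-s, s, (hidden_dim, hidden_dim))
        self.Wh = rng.uniform(-s, s, (hidden_dim, input_dim))
        self.Uh = rng.uniform(-s, s, (hidden_dim, hidden_dim))

    def encode(self, phrase):
        h = np.zeros(self.hidden_dim)
        for x in phrase:
            z = sigmoid(self.Wz @ x + self.Uz @ h)            # update gate
            r = sigmoid(self.Wr @ x + self.Ur @ h)            # reset gate
            h_tilde = np.tanh(self.Wh @ x + self.Uh @ (r * h))  # candidate
            h = (1.0 - z) * h + z * h_tilde
        return h

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Pairwise use: ONE shared encoder embeds both phrases of a pair;
# a training objective would then raise this score for true paraphrases.
rng = np.random.default_rng(1)
encoder = GRUEncoder(input_dim=4, hidden_dim=8)
phrase_a = [rng.standard_normal(4) for _ in range(3)]  # 3-word phrase
phrase_b = [rng.standard_normal(4) for _ in range(2)]  # 2-word phrase
emb_a = encoder.encode(phrase_a)
emb_b = encoder.encode(phrase_b)
similarity = cosine(emb_a, emb_b)
```

Because the same encoder processes both sides of the pair, any phrase, of any length, can later be embedded with it, which is what makes the learned model reusable beyond the training pairs.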


