Paraphrase Thought: Sentence Embedding Module Imitating Human Language Recognition

08/16/2018
by Myeongjun Jang, et al.

Sentence embedding is an important research topic in natural language processing. Generating an embedding vector that fully reflects the semantic meaning of a sentence is essential for achieving strong performance on various natural language processing tasks, such as machine translation and document classification. Various sentence embedding models have been proposed thus far, and their feasibility has been demonstrated through good performance on downstream tasks such as sentiment analysis and sentence classification. However, because performance on sentence classification and sentiment analysis can be improved even with simple sentence representation methods, good results on such tasks are not sufficient to claim that these models fully reflect the meanings of sentences. In this paper, inspired by human language recognition, we propose a criterion that a good sentence embedding method should satisfy, which we call semantic coherence: similar sentences should be located close to each other in the embedding space. We then propose the Paraphrase-Thought (P-thought) model, which pursues semantic coherence as much as possible. Experimental results on two paraphrase identification datasets (MS COCO and the STS Benchmark) show that the P-thought models outperform the benchmarked sentence embedding methods.
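
As a rough illustration of the semantic coherence criterion (a sketch, not the authors' implementation), the snippet below compares cosine similarities between sentence embeddings: under semantic coherence, a paraphrase pair should score higher than an unrelated pair. The `embed` function is a hypothetical placeholder standing in for any sentence encoder, such as the P-thought model.

```python
# Minimal sketch of the semantic-coherence check: paraphrases should lie
# closer together in the embedding space than unrelated sentences.
import numpy as np

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def embed(sentence: str) -> np.ndarray:
    """Hypothetical sentence encoder; replace with a real model
    (the random vectors here are placeholders and carry no semantics)."""
    rng = np.random.default_rng(abs(hash(sentence)) % (2**32))
    return rng.standard_normal(300)  # placeholder 300-d embedding

paraphrase_a = "A man is riding a horse."
paraphrase_b = "A person rides on a horse."
unrelated    = "The stock market closed lower today."

sim_para  = cosine_similarity(embed(paraphrase_a), embed(paraphrase_b))
sim_unrel = cosine_similarity(embed(paraphrase_a), embed(unrelated))

# A semantically coherent encoder should yield sim_para > sim_unrel.
print(f"paraphrase pair similarity: {sim_para:.3f}")
print(f"unrelated pair similarity:  {sim_unrel:.3f}")
```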
