Scale-dependent Relationships in Natural Language

12/16/2019
by Aakash Sarkar et al.

Natural language exhibits statistical dependencies at a wide range of scales. For instance, the mutual information between words in natural language decays like a power law with the temporal lag between them. However, many statistical learning models applied to language impose a sampling scale while extracting statistical structure. For example, Word2Vec constructs a vector embedding that maximizes the prediction between a target word and the context words that appear nearby in the corpus. The size of the context is chosen by the user and imposes a strong scale; relationships over much larger temporal scales would be invisible to the algorithm. This paper examines the family of Word2Vec embeddings generated while systematically manipulating the sampling scale used to define the context around each word. The primary result is that different linguistic relationships are preferentially encoded at different scales: different scales emphasize different syntactic and semantic relations between words. Moreover, the neighborhoods of a given word in the embeddings change significantly depending on the scale. These results suggest that any individual scale can only identify a subset of the meaningful relationships a word might have, and they point toward the importance of developing scale-free models of semantic meaning.
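As a concrete illustration of the scale manipulation, the sketch below trains several Word2Vec models that differ only in the size of the context window and compares the nearest neighbors of a probe word across scales. It assumes the gensim library (version 4 or later); the toy corpus, the probe word, and the particular window sizes are illustrative placeholders rather than the paper's actual experimental setup.

```python
# Sketch: train Word2Vec at several context scales and compare neighborhoods.
# Assumes gensim >= 4.0. `corpus` and `probe` are illustrative placeholders,
# not the corpus or vocabulary used in the paper.
from gensim.models import Word2Vec

corpus = [
    ["natural", "language", "exhibits", "structure", "at", "many", "scales"],
    ["context", "size", "defines", "a", "sampling", "scale", "for", "word2vec"],
    # ... a real corpus would contain many more sentences
]
probe = "scale"

neighborhoods = {}
for window in (2, 5, 15, 50):  # small to large context scales
    model = Word2Vec(
        corpus,
        vector_size=100,
        window=window,     # the sampling scale being manipulated
        min_count=1,
        sg=1,              # skip-gram variant
        seed=0,
        workers=1,         # single worker for reproducible runs
    )
    neighborhoods[window] = {w for w, _ in model.wv.most_similar(probe, topn=10)}

# Jaccard overlap between neighbor sets at different scales: low overlap
# indicates that the embedding neighborhoods depend on the sampling scale.
for a in sorted(neighborhoods):
    for b in sorted(neighborhoods):
        if a < b:
            inter = neighborhoods[a] & neighborhoods[b]
            union = neighborhoods[a] | neighborhoods[b]
            print(f"window {a} vs {b}: Jaccard = {len(inter) / len(union):.2f}")
```

Low Jaccard overlap between the neighbor sets at different window sizes would reproduce, in miniature, the paper's observation that a word's neighborhood depends on the sampling scale.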


Related research

07/23/2020 · Word Embeddings: Stability and Semantic Change
Word embeddings are computed by a class of techniques within natural lan...

02/05/2018 · Semantic projection: recovering human knowledge of multiple, distinct object features from word embeddings
The words of a language reflect the structure of the human mind, allowin...

07/19/2018 · Imparting Interpretability to Word Embeddings
As a ubiquitous method in natural language processing, word embeddings ...

01/22/2019 · Enhancing Semantic Word Representations by Embedding Deeper Word Relationships
Word representations are created using analogy context-based statistics ...

11/09/2020 · Language Through a Prism: A Spectral Approach for Multiscale Language Representations
Language exhibits structure at different scales, ranging from subwords t...

03/21/2023 · In-depth analysis of music structure as a self-organized network
Words in a natural language not only transmit information but also evolv...

03/02/2018 · Hybrid Model For Word Prediction Using Naive Bayes and Latent Information
Historically, the Natural Language Processing area has been given too mu...
