Streaming Word Embeddings with the Space-Saving Algorithm

04/24/2017
by Chandler May, et al.

We develop a streaming (one-pass, bounded-memory) word embedding algorithm based on the canonical skip-gram with negative sampling algorithm implemented in word2vec. We compare our streaming algorithm to word2vec empirically by measuring the cosine similarity between word pairs under each algorithm and by applying each algorithm in the downstream task of hashtag prediction on a two-month interval of the Twitter sample stream. We then discuss the results of these experiments, concluding they provide partial validation of our approach as a streaming replacement for word2vec. Finally, we discuss potential failure modes and suggest directions for future work.
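The Space-Saving algorithm named in the title is a classic one-pass, bounded-memory method for tracking approximately the most frequent items in a stream, which is the natural fit for maintaining a fixed-size vocabulary as described above. The sketch below is not the paper's embedding update itself, only an illustrative implementation of the Space-Saving counter under the standard formulation (the `SpaceSaving` class and its method names are ours):

```python
class SpaceSaving:
    """One-pass, bounded-memory approximate top-k counter (Space-Saving).

    Keeps at most `capacity` items; counts are overestimates by at most
    the count of the evicted item, which bounds the error.
    """

    def __init__(self, capacity):
        self.capacity = capacity
        self.counts = {}  # item -> estimated count

    def update(self, item):
        if item in self.counts:
            # Item already tracked: exact increment.
            self.counts[item] += 1
        elif len(self.counts) < self.capacity:
            # Spare capacity: start tracking the new item.
            self.counts[item] = 1
        else:
            # Table full: evict the item with the minimum count and let
            # the newcomer inherit that count plus one (an overestimate).
            victim = min(self.counts, key=self.counts.get)
            min_count = self.counts.pop(victim)
            self.counts[item] = min_count + 1
```

For example, streaming the tokens `a a a b b c` through a counter of capacity 2 leaves `a` tracked with its exact count while `c` displaces `b` and inherits an inflated count. In a streaming word2vec setting, a structure like this would decide which word types keep embedding rows as the vocabulary grows without bound.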


Related research:

Bayesian Neural Word Embedding (03/21/2016)
Recently, several works in the domain of natural language processing pre...

word2vec Skip-Gram with Negative Sampling is a Weighted Logistic PCA (05/27/2017)
We show that the skip-gram formulation of word2vec trained with negative...

Compressing Word Embeddings Using Syllables (01/13/2022)
This work examines the possibility of using syllable embeddings, instead...

Interpretable probabilistic embeddings: bridging the gap between topic models and neural networks (11/11/2017)
We consider probabilistic topic models and more recent word embedding te...

Norm of word embedding encodes information gain (12/19/2022)
Distributed representations of words encode lexical semantic information...

Character 3-gram Mover's Distance: An Effective Method for Detecting Near-duplicate Japanese-language Recipes (12/11/2019)
In websites that collect user-generated recipes, recipes are often poste...
