Compressing Word Embeddings

11/19/2015
by Martin Andrews, et al.

Recent methods for learning vector space representations of words have succeeded in capturing fine-grained semantic and syntactic regularities using vector arithmetic. However, these vector space representations (created through large-scale text analysis) are typically stored verbatim, since their internal structure is opaque. Using word-analogy tests to monitor the level of detail retained in compressed re-representations of the same vector space, the trade-offs between memory reduction and expressiveness are investigated. A simple scheme is outlined that can reduce the memory footprint of a state-of-the-art embedding by a factor of 10, with only minimal impact on performance. Then, using the same 'bit budget', a binary (approximate) factorisation of the same space is also explored, with the aim of creating an equivalent representation with better interpretability.
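The abstract does not spell out the compression scheme, but the general idea of trading bits per value against reconstruction fidelity can be illustrated with a minimal sketch. The code below assumes a simple per-dimension linear (min/max) quantization to 8-bit codes, which is an illustrative stand-in rather than the paper's actual method; the function names and the synthetic embedding matrix are hypothetical. Eight bits per value gives a 4x reduction over float32; pushing towards the factor of 10 quoted above would require fewer bits per value or a shared codebook.

```python
import numpy as np

def quantize_embeddings(emb, bits=8):
    """Per-dimension linear (min/max) quantization of a float32
    embedding matrix down to `bits` bits per value (stored as uint8)."""
    lo = emb.min(axis=0)
    hi = emb.max(axis=0)
    # Guard against division by zero on constant dimensions.
    scale = np.where(hi > lo, (hi - lo) / (2 ** bits - 1), 1.0)
    codes = np.round((emb - lo) / scale).astype(np.uint8)
    return codes, lo, scale

def dequantize_embeddings(codes, lo, scale):
    """Reconstruct an approximate float32 embedding matrix."""
    return codes.astype(np.float32) * scale + lo

# Synthetic stand-in for a trained embedding matrix (1000 words, 50 dims).
rng = np.random.default_rng(0)
emb = rng.standard_normal((1000, 50)).astype(np.float32)

codes, lo, scale = quantize_embeddings(emb, bits=8)
approx = dequantize_embeddings(codes, lo, scale)

# uint8 codes occupy a quarter of the float32 storage.
ratio = emb.nbytes / codes.nbytes
max_err = np.abs(emb - approx).max()
print(ratio, max_err)
```

Word-analogy accuracy on the reconstructed vectors, rather than raw reconstruction error, is the yardstick the abstract describes for judging how much detail survives a given bit budget.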
