Combining Static Word Embeddings and Contextual Representations for Bilingual Lexicon Induction

06/06/2021
by Jinpeng Zhang, et al.

Bilingual Lexicon Induction (BLI) aims to map words in one language to their translations in another, typically by learning linear projections that align monolingual word representation spaces. Two classes of word representations have been explored for BLI: static word embeddings and contextual representations, but no prior study has combined the two. In this paper, we propose a simple yet effective mechanism that combines static word embeddings and contextual representations to exploit the advantages of both paradigms. We test the combination mechanism on various language pairs under the supervised and unsupervised BLI benchmark settings. Experiments show that our mechanism consistently improves performance over robust BLI baselines on all language pairs, by an average of 3.2 points in the supervised setting and 3.1 points in the unsupervised setting.
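To make the pipeline the abstract refers to concrete, below is a minimal NumPy sketch of projection-based BLI: build a combined vector per word, fit a linear map on a seed dictionary, and retrieve translations by nearest neighbor. Everything here is an illustrative assumption, not the authors' released code: the combination step shown (weighted concatenation of L2-normalized static and averaged contextual vectors) is one plausible choice rather than the paper's proposed mechanism, the function names (`combine`, `procrustes`, `induce_lexicon`) are hypothetical, and the orthogonal-Procrustes solve is the standard supervised baseline.

```python
# Sketch of projection-based BLI with combined representations.
# Assumed inputs (stand-ins, not released artifacts):
#   static_*:  (n_words, d) static embeddings (e.g. fastText)
#   ctx_*:     (n_words, d) contextual vectors averaged over a corpus (e.g. mBERT)
#   seed_pairs: list of (source_word, target_word) training translations
import numpy as np

def l2_normalize(m, eps=1e-9):
    """Row-normalize a matrix so every word vector has unit L2 norm."""
    return m / (np.linalg.norm(m, axis=1, keepdims=True) + eps)

def combine(static_vecs, ctx_vecs, weight=0.5):
    """One plausible combination: weighted concatenation of the two
    normalized representation types (illustrative, not the paper's method)."""
    s = l2_normalize(static_vecs)
    c = l2_normalize(ctx_vecs)
    return np.concatenate([weight * s, (1.0 - weight) * c], axis=1)

def procrustes(X, Y):
    """Orthogonal map W minimizing ||XW - Y||_F via SVD of X^T Y
    (the standard supervised BLI baseline)."""
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

def induce_lexicon(src_mat, tgt_mat, src_words, tgt_words, seed_pairs):
    """Fit W on the seed dictionary, map the source space, and translate
    every source word by cosine nearest neighbor in the target space."""
    src_idx = {w: i for i, w in enumerate(src_words)}
    tgt_idx = {w: i for i, w in enumerate(tgt_words)}
    X = np.stack([src_mat[src_idx[s]] for s, _ in seed_pairs])
    Y = np.stack([tgt_mat[tgt_idx[t]] for _, t in seed_pairs])
    W = procrustes(X, Y)
    mapped = l2_normalize(src_mat @ W)
    sims = mapped @ l2_normalize(tgt_mat).T  # cosine similarity matrix
    return {w: tgt_words[int(sims[i].argmax())] for w, i in src_idx.items()}
```

In practice, retrieval would use CSLS rather than plain nearest neighbor to mitigate hubness, and the unsupervised setting would bootstrap the seed dictionary iteratively instead of assuming one.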


Related research

Cross-Lingual Contextual Word Embeddings Mapping With Multi-Sense Words In Mind (09/18/2019)
Recent work in cross-lingual contextual word embedding learning cannot h...

Bilingual Lexicon Induction via Unsupervised Bitext Construction and Word Alignment (01/01/2021)
Bilingual lexicons map words in one language to their translations in an...

Unsupervised POS Induction with Word Embeddings (03/23/2015)
Unsupervised word embeddings have been shown to be valuable as features ...

Understanding and Improving Multi-Sense Word Embeddings via Extended Robust Principal Component Analysis (03/03/2018)
Unsupervised learned representations of polysemous words generate a larg...

A Simple Approach to Learn Polysemous Word Embeddings (07/06/2017)
Many NLP applications require disambiguating polysemous words. Existing ...

Obtaining Better Static Word Embeddings Using Contextual Embedding Models (06/08/2021)
The advent of contextual word embeddings – representations of words whic...

From Hyperbolic Geometry Back to Word Embeddings (04/26/2022)
We choose random points in the hyperbolic disc and claim that these poin...
