Identifying Necessary Elements for BERT's Multilinguality

05/01/2020
by Philipp Dufter, et al.

It has been shown that multilingual BERT (mBERT) yields high-quality multilingual representations and enables effective zero-shot transfer. This is surprising given that mBERT does not use any kind of crosslingual signal during training. While recent literature has studied this effect, the exact reason for mBERT's multilinguality is still unknown. We aim to identify architectural properties of BERT as well as linguistic properties of languages that are necessary for BERT to become multilingual. To allow for fast experimentation, we propose an efficient setup with small BERT models and synthetic as well as natural data. Overall, we identify six elements that are potentially necessary for BERT to be multilingual. Architectural factors that contribute to multilinguality are underparameterization, shared special tokens (e.g., "[CLS]"), shared position embeddings, and replacing masked tokens with random tokens. Factors related to training data that are beneficial for multilinguality are similar word order and comparability of corpora.
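Two of the architectural factors named above, an underparameterized model with shared position embeddings and the MLM rule that occasionally replaces masked tokens with random ones, can be made concrete with a small sketch. The snippet below is an illustrative reconstruction, not the authors' code: all hyperparameter values are assumptions, and it uses the Hugging Face transformers API only for brevity. The intuition it illustrates is that a deliberately small model must reuse the same parameters (including position embeddings and special tokens) for both languages, which is one reading of why these factors encourage multilinguality.

```python
# Minimal sketch of a "small BERT" MLM setup (illustrative only; hyperparameters
# are assumptions, not values from the paper).
import torch
from transformers import BertConfig, BertForMaskedLM

# Underparameterized BERT: tiny hidden size, few layers and heads.
config = BertConfig(
    vocab_size=2048,               # assumed small vocabulary shared by both languages
    hidden_size=64,
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
    max_position_embeddings=128,   # position embeddings are shared across languages
)
model = BertForMaskedLM(config)

def mask_tokens(input_ids, mask_token_id, vocab_size, mlm_prob=0.15):
    """Standard BERT MLM corruption: of the selected positions, 80% become
    [MASK], 10% are replaced by a random token (the "random replacement"
    factor from the abstract), and 10% are left unchanged.
    Note: input_ids is modified in place."""
    labels = input_ids.clone()
    probs = torch.full(input_ids.shape, mlm_prob)
    masked = torch.bernoulli(probs).bool()
    labels[~masked] = -100  # compute the loss only on masked positions

    # 80% of masked positions -> [MASK]
    to_mask = torch.bernoulli(torch.full(input_ids.shape, 0.8)).bool() & masked
    input_ids[to_mask] = mask_token_id

    # Half of the remaining masked positions (i.e., 10% overall) -> random token
    to_random = torch.bernoulli(torch.full(input_ids.shape, 0.5)).bool() & masked & ~to_mask
    input_ids[to_random] = torch.randint(vocab_size, input_ids.shape)[to_random]
    return input_ids, labels
```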
