FILTER: An Enhanced Fusion Method for Cross-lingual Language Understanding

09/10/2020
by Yuwei Fang, et al.

Large-scale cross-lingual language models (LMs), such as mBERT, Unicoder and XLM, have achieved great success in cross-lingual representation learning. However, when applied to zero-shot cross-lingual transfer tasks, most existing methods use only single-language input for LM finetuning, without leveraging the intrinsic cross-lingual alignment between different languages that is essential for multilingual tasks. In this paper, we propose FILTER, an enhanced fusion method that takes cross-lingual data as input for XLM finetuning. Specifically, FILTER first encodes the text input in the source language and its translation in the target language independently in the shallow layers, then performs cross-lingual fusion to extract multilingual knowledge in the intermediate layers, and finally performs further language-specific encoding. During inference, the model makes predictions based on the text input in the target language and its translation in the source language. For simple tasks such as classification, translated text in the target language shares the same label as the source language. However, this shared label becomes less accurate or even unavailable for more complex tasks such as question answering, NER and POS tagging. For better model scalability, we further propose an additional KL-divergence self-teaching loss for model training, based on auto-generated soft pseudo-labels for translated text in the target language. Extensive experiments demonstrate that FILTER achieves new state of the art on two challenging multilingual multi-task benchmarks, XTREME and XGLUE.
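
For concreteness, the sketch below illustrates the three-stage fusion idea and the KL-divergence self-teaching loss in PyTorch. It is a minimal reading of the abstract, not the authors' implementation: the layer split (2 shallow / 2 fusion / 2 deep), hidden size, shared parameters across the two language branches, and the use of the source-branch predictions as the soft pseudo-labels are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FilterSketch(nn.Module):
    """Minimal sketch of the FILTER fusion idea (illustrative, not the paper's code).

    Stage 1: encode the source text and its target-language translation
             independently with shallow layers.
    Stage 2: concatenate the two sequences and run fusion layers so the
             two languages can attend to each other.
    Stage 3: split the sequences again and run further encoding, then
             classify each branch from its first token.
    """

    def __init__(self, hidden=256, heads=4, n_shallow=2, n_fusion=2, n_deep=2,
                 vocab=30000, num_labels=3):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        layer = lambda: nn.TransformerEncoderLayer(hidden, heads, batch_first=True)
        self.shallow = nn.TransformerEncoder(layer(), n_shallow)
        self.fusion = nn.TransformerEncoder(layer(), n_fusion)
        self.deep = nn.TransformerEncoder(layer(), n_deep)
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, src_ids, tgt_ids):
        # Stage 1: language-independent shallow encoding.
        h_src = self.shallow(self.embed(src_ids))
        h_tgt = self.shallow(self.embed(tgt_ids))
        # Stage 2: cross-lingual fusion over the concatenated sequences.
        fused = self.fusion(torch.cat([h_src, h_tgt], dim=1))
        h_src, h_tgt = fused.split([src_ids.size(1), tgt_ids.size(1)], dim=1)
        # Stage 3: further encoding, then one prediction per language branch.
        logits_src = self.classifier(self.deep(h_src)[:, 0])
        logits_tgt = self.classifier(self.deep(h_tgt)[:, 0])
        return logits_src, logits_tgt


def filter_loss(logits_src, logits_tgt, labels, kl_weight=1.0):
    """Cross-entropy on the source-language labels plus a KL-divergence
    self-teaching term that treats the source predictions as soft
    pseudo-labels for the translated (target-language) input."""
    ce = F.cross_entropy(logits_src, labels)
    kl = F.kl_div(F.log_softmax(logits_tgt, dim=-1),
                  F.softmax(logits_src, dim=-1).detach(),
                  reduction="batchmean")
    return ce + kl_weight * kl


# Toy usage with random token ids, just to show the shapes flow through.
model = FilterSketch()
src = torch.randint(0, 30000, (2, 16))
tgt = torch.randint(0, 30000, (2, 20))
logits_src, logits_tgt = model(src, tgt)
loss = filter_loss(logits_src, logits_tgt, torch.tensor([0, 2]))

In this sketch the soft pseudo-labels come directly from the source branch of the same model; the paper's auto-generated pseudo-labels for sequence-level tasks such as QA, NER, and POS tagging would require a task-specific teacher pass, which is omitted here.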
