Lost in Embedding Space: Explaining Cross-Lingual Task Performance with Eigenvalue Divergence
Performance in cross-lingual NLP tasks is impacted by the (dis)similarity of the languages at hand: e.g., previous work has suggested a connection between the expected success of bilingual lexicon induction (BLI) and the assumption of (approximate) isomorphism between monolingual embedding spaces. In this work, we present a large-scale study of the correlations between language similarity and task performance, covering thousands of language pairs and four different tasks: BLI, machine translation, parsing, and POS tagging. We propose a novel language distance measure, Eigenvalue Divergence (EVD), which quantifies the degree of isomorphism between two monolingual embedding spaces. We empirically show that 1) language similarity scores derived from embedding-based EVD distances are strongly associated with performance observed in different cross-lingual tasks, and 2) EVD outperforms other standard embedding-based language distance measures across the board, while being computationally more tractable and easier to interpret. Finally, we demonstrate that EVD captures information complementary to typologically driven language distance measures, and that combining the two yields even higher correlations with performance levels in all cross-lingual tasks.
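To make the idea of comparing eigenvalue spectra concrete, below is a minimal, hypothetical sketch of one way to score how (an)isomorphic two monolingual embedding spaces are. It is not the paper's exact EVD definition (see the full text for that); the function name, the use of covariance eigenvalues, and the trace normalisation are illustrative assumptions.

```python
# Illustrative sketch, NOT the paper's exact EVD formula: compare the sorted,
# trace-normalised eigenvalue spectra of two monolingual embedding matrices
# as a rough proxy for how (an)isomorphic the two spaces are.
import numpy as np

def spectral_divergence(X: np.ndarray, Y: np.ndarray) -> float:
    """Divergence between two embedding matrices (rows = word vectors)."""
    def spectrum(E: np.ndarray) -> np.ndarray:
        E = E - E.mean(axis=0, keepdims=True)         # centre the embeddings
        cov = (E.T @ E) / max(len(E) - 1, 1)          # d x d covariance matrix
        eig = np.linalg.eigvalsh(cov)[::-1]           # eigenvalues, descending
        return eig / eig.sum()                        # trace-normalise

    sx, sy = spectrum(X), spectrum(Y)
    d = min(len(sx), len(sy))                         # compare shared dimensions
    return float(np.sum((sx[:d] - sy[:d]) ** 2))

# Toy usage: two random 300-dim "monolingual" spaces over 5k words each.
rng = np.random.default_rng(0)
space_a = rng.normal(size=(5000, 300))
space_b = rng.normal(size=(5000, 300)) @ np.diag(np.linspace(0.5, 1.5, 300))
print(spectral_divergence(space_a, space_b))
```

A smaller divergence indicates more similar spectra, and hence, under this simplified view, a greater chance that a linear map can align the two spaces well.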