Cross-Lingual Transfer in Zero-Shot Cross-Language Entity Linking

10/19/2020
by Elliot Schumacher, et al.

Cross-language entity linking grounds mentions written in multiple languages to a single-language knowledge base. We propose a neural ranking architecture for this task that scores candidate entities using multilingual BERT representations of the mention and its surrounding context. We find that the multilingual ability of BERT yields robust performance in both monolingual and multilingual settings. Furthermore, we explore zero-shot language transfer and find surprisingly robust performance. We investigate the zero-shot degradation and find that it can be partially mitigated by a proposed auxiliary training objective, but that the remaining error is best attributed to domain shift rather than to language transfer.
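
For illustration, here is a minimal sketch of how a mention in its context might be encoded with multilingual BERT and ranked against candidate entries from an English knowledge base. It assumes the Hugging Face transformers library and uses cosine similarity as a stand-in scoring function; the paper's actual ranking network and training objective are not reproduced here.

```python
# Minimal sketch: rank English KB candidates for a non-English mention
# using multilingual BERT. The cosine-similarity scoring is an
# illustrative stand-in, not the paper's trained ranker.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
encoder = AutoModel.from_pretrained("bert-base-multilingual-cased")

def embed(text: str) -> torch.Tensor:
    """Encode text with multilingual BERT and return the [CLS] vector."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = encoder(**inputs)
    return outputs.last_hidden_state[:, 0, :]  # shape: (1, hidden_size)

# A Spanish mention in context, scored against English KB candidates.
mention_context = "La capital de Francia es [París], una ciudad europea."
candidates = ["Paris, capital of France", "Paris, Texas", "Paris (mythology)"]

mention_vec = embed(mention_context)
candidate_vecs = torch.cat([embed(c) for c in candidates], dim=0)

# Rank candidates by similarity to the mention representation.
scores = torch.nn.functional.cosine_similarity(mention_vec, candidate_vecs)
best = scores.argmax().item()
print(f"Top candidate: {candidates[best]} (score={scores[best]:.3f})")
```

Because mBERT maps text from many languages into a shared representation space, a ranker trained over such encodings on one language can, in principle, be applied zero-shot to mentions in another, which is the transfer setting the paper studies.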
