Near or Far, Wide Range Zero-Shot Cross-Lingual Dependency Parsing
Cross-lingual transfer is the major means to leverage knowledge from high-resource languages to help low-resource languages. In this paper, we investigate cross-lingual transfer across a broad spectrum of language distances. We posit that Recurrent Neural Network (RNN)-based encoders, since they explicitly incorporate surface word order, are brittle when transferring across distant languages, while self-attentive models are more flexible in modeling word order information and would therefore be more robust in the cross-lingual transfer setting. We test our hypothesis by training dependency parsers on an English corpus only and evaluating them on 31 other languages. Through detailed analysis, we find interesting patterns showing that RNN-based architectures transfer well to languages that are close to English, while self-attentive models have better cross-lingual transferability across a wide range of languages.
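To make the architectural contrast concrete, below is a minimal sketch (not the paper's implementation) of the two encoder families compared in the abstract: a BiLSTM encoder, whose hidden states are computed sequentially and thus encode surface word order directly, and a self-attentive (Transformer-style) encoder, whose token interactions are order-agnostic unless position information is injected. Dimensions and layer counts are illustrative assumptions; the biaffine parsing head, multilingual word embeddings, and training loop are omitted.

```python
import torch
import torch.nn as nn

class RNNEncoder(nn.Module):
    """BiLSTM encoder: states are built left-to-right and right-to-left,
    so surface word order is baked into every representation."""
    def __init__(self, dim=128):
        super().__init__()
        self.rnn = nn.LSTM(dim, dim // 2, batch_first=True, bidirectional=True)

    def forward(self, x):             # x: (batch, seq_len, dim)
        out, _ = self.rnn(x)
        return out                    # (batch, seq_len, dim)

class SelfAttentiveEncoder(nn.Module):
    """Transformer-style encoder: attention itself ignores order, which
    leaves word-order modeling to whatever position encoding is added."""
    def __init__(self, dim=128, heads=4, layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)

    def forward(self, x):             # x: (batch, seq_len, dim)
        return self.encoder(x)

# Toy usage: two sentences of 10 word vectors each.
x = torch.randn(2, 10, 128)
print(RNNEncoder()(x).shape, SelfAttentiveEncoder()(x).shape)
```

In a zero-shot setup like the one described, both encoders would be trained only on English treebank data and then applied unchanged to the other 31 languages, so any difference in robustness reflects how tightly each architecture ties its representations to English word order.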