Utilizing Lexical Similarity between Related, Low-resource Languages for Pivot-based SMT

02/23/2017
by Anoop Kunchukuttan, et al.

We investigate pivot-based translation between related languages in a low-resource, phrase-based SMT setting. We show that a subword-level pivot-based SMT model using a related pivot language is substantially better than word- and morpheme-level pivot models. It is also highly competitive with the best direct translation model, which is encouraging given that no direct source-target training corpus is used. We also show that combining multiple related-language pivot models can rival a direct translation model. Thus, the use of subwords as translation units, coupled with multiple related pivot languages, can compensate for the lack of a direct parallel corpus.
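The intuition behind subword-level translation units is that related languages share many cognates whose surface forms differ slightly, so word-level models see no overlap while subword-level models do. The sketch below is a simplified illustration of this idea (it is not the paper's method, which uses units such as orthographic syllables on Indic scripts): character bigrams of two hypothetical romanized cognates overlap even though the full words do not match.

```python
def char_ngrams(word, n=2):
    """Return the set of character n-grams of a word (a crude stand-in
    for subword segmentation such as orthographic syllables or BPE)."""
    return {word[i:i + n] for i in range(len(word) - n + 1)}

# Hypothetical romanized cognates from two related languages.
src_word = "sundar"
piv_word = "shundor"

# Word-level models treat these as entirely distinct vocabulary items.
word_overlap = {src_word} & {piv_word}

# Subword-level models can exploit the shared character sequences.
subword_overlap = char_ngrams(src_word) & char_ngrams(piv_word)

print(word_overlap)     # empty: no word-level correspondence
print(subword_overlap)  # shared bigrams such as 'un' and 'nd'
```

In a pivot setting, this shared subword inventory means the source-pivot and pivot-target phrase tables align on many common units, which is one reason a related pivot language can partially substitute for a direct parallel corpus.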
