Multi-Source Domain Adaptation with Mixture of Experts

09/07/2018
by Jiang Guo, et al.

We propose a mixture-of-experts approach for unsupervised domain adaptation from multiple sources. The key idea is to explicitly capture the relationship between a target example and different source domains. This relationship, expressed by a point-to-set metric, determines how to combine predictors trained on various domains. The metric is learned in an unsupervised fashion using meta-training. Experimental results on sentiment analysis and part-of-speech tagging demonstrate that our approach consistently outperforms multiple baselines and can robustly handle negative transfer.
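To make the mixture idea concrete, below is a minimal sketch (not the paper's implementation) of combining per-domain expert predictions with weights derived from a point-to-set metric. In the paper the metric is learned via meta-training; here, as a stand-in assumption, we use the Euclidean distance from the target example's encoding to the mean of each source domain's encodings, and the function and variable names (`point_to_set_distance`, `moe_predict`, `temperature`) are illustrative, not from the paper.

```python
import numpy as np

def point_to_set_distance(x, domain_encodings):
    """Distance from a single target encoding x to a set of source-domain
    encodings, taken here as the distance to the set's mean (a simple
    stand-in for the learned metric described in the paper)."""
    return np.linalg.norm(x - domain_encodings.mean(axis=0))

def moe_predict(x, source_domains, experts, temperature=1.0):
    """Mixture-of-experts prediction: weight each source-domain expert by a
    softmax over negated point-to-set distances, then average predictions."""
    dists = np.array([point_to_set_distance(x, d) for d in source_domains])
    weights = np.exp(-dists / temperature)
    weights /= weights.sum()
    preds = np.stack([expert(x) for expert in experts])  # (n_domains, n_classes)
    return weights @ preds                               # combined prediction

# Toy usage: three source domains with random encodings and dummy experts.
rng = np.random.default_rng(0)
source_domains = [rng.normal(loc=i, size=(50, 8)) for i in range(3)]
experts = [lambda x, i=i: np.eye(3)[i] for i in range(3)]  # expert i "votes" for class i
x = rng.normal(loc=1.0, size=8)                            # target example nearest domain 1
print(moe_predict(x, source_domains, experts))
```

Because the weights depend on each individual target example, a source domain that is far from the example contributes little to the prediction, which is one intuition for why such a scheme can mitigate negative transfer.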
