RAKA: Co-training of Relationships and Attributes for Cross-lingual Knowledge Alignment

10/29/2019
by Bo Chen, et al.

Cross-lingual knowledge alignment suffers from attribute heterogeneity when leveraging attributes, and from conflicts when combining the results inferred from attributes and relationships. This paper proposes an interaction-based attribute model that captures attribute-level interactions for estimating entity similarities, eliminating the negative impact of dissimilar attributes. The model adopts a matrix-based strategy to accelerate similarity estimation. We further propose a co-training framework, together with three merge strategies, to combine the alignments inferred by the attribute model and the relationship model. The whole framework can effectively and efficiently infer aligned entities, relationships, attributes, and values simultaneously. Experimental results on several cross-lingual knowledge datasets show that our model significantly outperforms state-of-the-art comparison methods (improving Ratio@1 by 2.35-51.57).
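The abstract does not give the model's exact formulation, but the idea of computing all attribute-level interactions with a single matrix operation and discarding dissimilar pairs can be illustrated with a small sketch. The function name, the cosine-similarity choice, and the max-pooling aggregation below are assumptions for illustration only, not the paper's actual method.

```python
# Illustrative sketch: matrix-based attribute-level interaction similarity.
# All names and design choices here are hypothetical; the paper's model may differ.
import numpy as np

def attribute_interaction_similarity(attrs_a: np.ndarray, attrs_b: np.ndarray) -> float:
    """Estimate entity similarity from two sets of attribute embeddings.

    attrs_a: (m, d) embeddings of entity A's attributes.
    attrs_b: (n, d) embeddings of entity B's attributes.
    """
    # L2-normalise so the dot products below are cosine similarities.
    a = attrs_a / np.linalg.norm(attrs_a, axis=1, keepdims=True)
    b = attrs_b / np.linalg.norm(attrs_b, axis=1, keepdims=True)

    # One matrix multiplication yields all m*n attribute-level interactions.
    interactions = a @ b.T  # shape (m, n)

    # Keep only each attribute's best match, so dissimilar attribute pairs
    # contribute nothing to the entity-level score.
    best_a = interactions.max(axis=1)  # best counterpart for each attribute of A
    best_b = interactions.max(axis=0)  # best counterpart for each attribute of B
    return float((best_a.mean() + best_b.mean()) / 2)

# Example with random attribute embeddings for two entities.
rng = np.random.default_rng(0)
print(attribute_interaction_similarity(rng.normal(size=(5, 32)),
                                       rng.normal(size=(7, 32))))
```

Batching the pairwise interactions into one matrix product is what makes this kind of similarity estimation fast in practice, which is the efficiency point the abstract alludes to.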
