Exploring the Combination of Contextual Word Embeddings and Knowledge Graph Embeddings

04/17/2020
by Lea Dieudonat, et al.

“Classical” word embeddings, such as Word2Vec, have been shown to capture the semantics of words based on their distributional properties. However, their ability to represent the different meanings a word may have is limited, and they do not explicitly encode the relations between the entities that words denote. Embeddings of knowledge bases (KB), by contrast, capture the explicit relations between entities but cannot directly capture the syntagmatic properties of the corresponding words. To our knowledge, recent research has focused on representation learning that augments the strengths of one approach with the other. In this work, we begin exploring a different approach, using contextual and KB embeddings jointly at the same level, and propose two tasks – an entity typing and a relation typing task – that evaluate the performance of contextual and KB embeddings. We also evaluate a model that concatenates contextual and KB embeddings on these two tasks and obtain conclusive results on the first. We hope our work can serve as a basis for models and datasets that develop this approach further.
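To make the concatenation idea concrete, here is a minimal sketch (not the authors' code) of entity typing with joint embeddings: a contextual embedding of an entity mention is concatenated with a pretrained KB embedding of the entity and fed to a linear classifier. The model name, the KB vector table, the entity id, and all dimensions are illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

KB_DIM = 100    # assumed dimensionality of the KB (e.g., TransE-style) embeddings
NUM_TYPES = 4   # assumed number of entity types (e.g., PER, ORG, LOC, MISC)

# Hypothetical KB embedding table: entity id -> pretrained vector.
# Random vectors stand in for real KB embeddings here.
kb_embeddings = {"Q90": torch.randn(KB_DIM)}

classifier = nn.Linear(encoder.config.hidden_size + KB_DIM, NUM_TYPES)

def type_entity(sentence: str, mention: str, entity_id: str) -> torch.Tensor:
    """Return entity-type logits for `mention` as it appears in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state[0]  # (seq_len, hidden_size)
    # Contextual embedding: average the subword vectors of the mention,
    # located here by simple subsequence matching over the token ids.
    mention_ids = tokenizer(mention, add_special_tokens=False)["input_ids"]
    seq = inputs["input_ids"][0].tolist()
    start = next(i for i in range(len(seq))
                 if seq[i:i + len(mention_ids)] == mention_ids)
    contextual = hidden[start:start + len(mention_ids)].mean(dim=0)
    # Concatenate with the entity's KB embedding and classify.
    joint = torch.cat([contextual, kb_embeddings[entity_id]])
    return classifier(joint)

logits = type_entity("Paris is the capital of France.", "Paris", "Q90")
print(logits.shape)  # torch.Size([4])
```

In practice the classifier would be trained on labeled mentions; the sketch only shows how the two embedding spaces are combined at the same level rather than one being used to refine the other.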
