Composing Knowledge Graph Embeddings via Word Embeddings
Learning knowledge graph embeddings from an existing knowledge graph is central to knowledge graph completion. For a fact (h, r, t), in which the head entity h is connected to the tail entity t by the relation r, current approaches learn low-dimensional vector representations, one for each element of (h, r, t). Because these representations are learned only from the facts already present in a knowledge graph, they cannot be used to detect unknown facts whose entities or relations never occur in the knowledge graph. This paper proposes a new approach, called TransW, which goes beyond current work by composing knowledge graph embeddings from word embeddings. Since an entity or a relation typically consists of one or more words, it is sensible to learn a mapping function from the word embedding space to the knowledge embedding space, which captures how entities are constructed from human words. More importantly, composing knowledge embeddings from word embeddings makes it possible to handle newly emerging facts, whether they involve new entities or new relations. Experimental results on three public datasets show the consistency and superior performance of the proposed TransW.
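To make the composition idea concrete, here is a minimal Python sketch, not the paper's exact TransW formulation: entity and relation embeddings are composed by projecting the averaged word vectors of their surface names into the knowledge embedding space, and triples are scored with a TransE-style distance. All names and parameters (word_vecs, W_e, W_r, compose, score) are hypothetical, and the projection matrices are random stand-ins for what would be learned during training.

```python
import numpy as np

rng = np.random.default_rng(0)

WORD_DIM, KG_DIM = 50, 32

# Toy word-embedding table (in practice: pretrained vectors, e.g. word2vec).
vocab = ["barack", "obama", "president", "of", "united", "states"]
word_vecs = {w: rng.normal(size=WORD_DIM) for w in vocab}

# Hypothetical linear maps from word space to entity/relation embedding space.
# In TransW-style training these would be learned jointly with the scoring
# objective; here they are random placeholders for illustration.
W_e = rng.normal(size=(KG_DIM, WORD_DIM)) * 0.1
W_r = rng.normal(size=(KG_DIM, WORD_DIM)) * 0.1

def compose(name, W):
    """Compose an embedding from the words in a surface name."""
    vecs = [word_vecs[w] for w in name.lower().split()]
    return W @ np.mean(vecs, axis=0)  # average word vectors, then project

def score(h_name, r_name, t_name):
    """TransE-style plausibility score: smaller means more plausible."""
    h = compose(h_name, W_e)
    r = compose(r_name, W_r)
    t = compose(t_name, W_e)
    return np.linalg.norm(h + r - t, ord=1)

# The score is defined even if "barack obama" never appeared as an entity
# during training, as long as its constituent words have embeddings.
print(score("barack obama", "president of", "united states"))
```

Because every embedding is assembled from word vectors rather than looked up in a fixed entity table, unseen entities and relations receive embeddings for free, which is the property the abstract highlights for handling emerging facts.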