Exploring Effects of Random Walk Based Minibatch Selection Policy on Knowledge Graph Completion

by Bishal Santra et al.

In this paper, we explore the effects of different minibatch sampling techniques on Knowledge Graph Completion. Knowledge Graph Completion (KGC), or Link Prediction, is the task of predicting missing facts in a knowledge graph. KGC models are usually trained with a margin, soft-margin, or cross-entropy loss function that promotes assigning higher scores or probabilities to true fact triplets, and these losses are optimized with minibatch gradient descent. However, because each minibatch contains only a few triplets sampled at random from a large knowledge graph, most entities appear at most once in any given minibatch. As a result, when an entity's embedding is updated at a minibatch step, the loss ignores all of that entity's other neighbors. We propose a new random-walk based minibatch sampling technique for training KGC models that optimizes the loss over a minibatch of closely connected triplets forming a subgraph, instead of independently sampled ones. We report experimental results for different models and datasets with our sampling technique and find that the proposed sampling algorithm has varying effects across these datasets/models. In particular, our proposed method achieves state-of-the-art performance on the DB100K dataset.
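The core idea of the abstract can be illustrated with a minimal sketch: rather than drawing triplets uniformly at random, start a random walk from some entity and collect each traversed triplet, so that entities recur within the batch and their neighborhoods co-occur. The function name, the triplet representation as `(head, relation, tail)` tuples, and all parameters below are illustrative assumptions, not the paper's actual implementation.

```python
import random
from collections import defaultdict

def random_walk_minibatch(triplets, batch_size, seed=None):
    """Sample a minibatch of closely connected triplets via a random walk.

    `triplets` is an iterable of (head, relation, tail) tuples. The walk
    starts at a random entity, repeatedly picks a random incident triplet,
    adds it to the batch, and moves to the triplet's other endpoint. Note
    that, unlike uniform sampling without replacement, the walk may revisit
    the same triplet.
    """
    rng = random.Random(seed)

    # Adjacency: entity -> list of incident triplets (in either direction).
    adj = defaultdict(list)
    for h, r, t in triplets:
        adj[h].append((h, r, t))
        adj[t].append((h, r, t))

    batch = []
    current = rng.choice(list(adj))  # random starting entity
    while len(batch) < batch_size:
        h, r, t = rng.choice(adj[current])
        batch.append((h, r, t))
        # Step to the other endpoint of the sampled edge.
        current = t if current == h else h
    return batch
```

By construction, consecutive triplets in the returned batch share an entity, so each entity's embedding update sees several of its neighbors within the same gradient step.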




