Self-supervised Learning for Heterogeneous Graph via Structure Information based on Metapath

by Shuai Ma, et al.

Graph neural networks (GNNs) are the dominant paradigm for modeling graph-structured data by learning universal node representations. Training GNNs traditionally depends on a large amount of labeled data, which is costly and time-consuming to obtain; in some settings it is simply unavailable. Self-supervised representation learning, which generates labels from the graph-structured data itself, is a promising way to tackle this problem. Self-supervised learning on heterogeneous graphs is more challenging than on homogeneous graphs, and it has received comparatively little study. In this paper, we propose a SElf-supervised learning method for heterogeneous graphs via Structure Information based on Metapaths (SESIM). The proposed model constructs pretext tasks that predict the jump number between nodes along each metapath, improving the representations learned for the primary task. To predict jump numbers, SESIM generates labels from the data itself, avoiding time-consuming manual labeling. Moreover, predicting the jump number along each metapath makes effective use of graph structure information, an essential property of the relations between nodes, so SESIM deepens the model's understanding of graph structure. Finally, we train the primary task and the pretext tasks jointly, using meta-learning to balance the contribution of each pretext task to the primary task. Empirical results validate the performance of SESIM and demonstrate that it improves the representation ability of traditional GNNs on link prediction and node classification tasks.
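The core self-supervised signal described above, the "jump number" between two nodes along a metapath, can be computed from the graph alone. The following sketch illustrates the idea on a toy author-paper graph with an author-paper-author (APA) metapath; the node names, edge-type encoding, and helper functions are illustrative assumptions, not the paper's implementation.

```python
from collections import deque

def metapath_neighbors(edges, metapath, start):
    """Nodes reachable from `start` by following one full metapath instance."""
    frontier = {start}
    for etype in metapath:
        nxt = set()
        for u in frontier:
            nxt.update(edges.get((etype, u), ()))
        frontier = nxt
    return frontier

def jump_number_labels(edges, metapath, nodes, max_jumps=3):
    """Self-supervised labels: number of metapath 'jumps' between node pairs,
    found by BFS over the metapath-induced graph (no manual labeling)."""
    labels = {}
    for src in nodes:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            if dist[u] == max_jumps:  # cap the pretext-task range
                continue
            for v in metapath_neighbors(edges, metapath, u):
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        for dst, d in dist.items():
            if dst != src:
                labels[(src, dst)] = d
    return labels

# Toy heterogeneous graph: ('ap', author) -> papers, ('pa', paper) -> authors.
edges = {
    ("ap", "a1"): ["p1"], ("ap", "a2"): ["p1", "p2"], ("ap", "a3"): ["p2"],
    ("pa", "p1"): ["a1", "a2"], ("pa", "p2"): ["a2", "a3"],
}
labels = jump_number_labels(edges, ("ap", "pa"), ["a1", "a2", "a3"])
print(labels[("a1", "a2")])  # a1 reaches a2 in one APA jump -> 1
print(labels[("a1", "a3")])  # a1 reaches a3 in two APA jumps -> 2
```

In a full model, these pair labels would supervise an auxiliary prediction head on top of the GNN's node embeddings, trained jointly with the primary task.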




