Self-supervised Learning and Graph Classification under Heterophily

06/14/2023
by   Yilin Ding, et al.

Self-supervised learning has shown promising capability in graph representation learning in recent work. Most existing pre-training strategies build on popular graph neural networks (GNNs), which can be seen as a special form of low-pass filter and therefore fail to effectively capture heterophily. In this paper, we first present an experimental investigation of the performance of low-pass and high-pass filters on heterophilous graph classification; the results clearly show that high-frequency signals are important for learning heterophilous graph representations. On the other hand, it is still unclear how to effectively capture the structural patterns of graphs and how to measure the capability of a self-supervised pre-training strategy to capture graph structure. To address this problem, we first design a quantitative metric to Measure Graph Structure (MGS), which analyzes the correlation between the structural similarity and the embedding similarity of graph pairs. Then, to enhance the graph structural information captured by self-supervised learning, we propose a novel self-supervised strategy for Pre-training GNNs based on the Metric (PGM). Extensive experiments validate that our pre-training strategy achieves state-of-the-art performance for molecular property prediction and protein function prediction. In addition, we find that choosing a suitable filter can sometimes be better than designing a good pre-training strategy for heterophilous graph classification.
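The abstract specifies MGS only as a correlation between the structural similarity and the embedding similarity of graph pairs. A minimal sketch of that idea might look like the following; note that the function names, the use of cosine similarity for embeddings, and the choice of Spearman rank correlation are our illustrative assumptions, not details taken from the paper.

```python
import math

def cosine_sim(u, v):
    # Cosine similarity between two embedding vectors (assumed choice).
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def _ranks(xs):
    # Rank each value (0 = smallest); assumes no ties for simplicity.
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order):
        r[i] = float(rank)
    return r

def _pearson(x, y):
    # Pearson correlation; applied to ranks this gives Spearman's rho.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def mgs_score(struct_sims, emb_pairs):
    """Hypothetical MGS-style score: rank correlation between
    structural similarity (e.g. derived from graph edit distance)
    and embedding similarity over a set of graph pairs."""
    emb_sims = [cosine_sim(u, v) for u, v in emb_pairs]
    return _pearson(_ranks(struct_sims), _ranks(emb_sims))
```

A score near 1 would indicate that the pre-trained encoder orders graph pairs by embedding similarity in the same way a structural-similarity measure does, which is the capability the metric is meant to quantify.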


Related research

- Pairwise Half-graph Discrimination: A Simple Graph-level Self-supervised Strategy for Pre-training Graph Neural Networks (10/26/2021)
  Self-supervised learning has gradually emerged as a powerful technique f...

- Alleviating neighbor bias: augmenting graph self-supervise learning with structural equivalent positive samples (12/08/2022)
  In recent years, using a self-supervised learning framework to learn the...

- Motif-based Graph Self-Supervised Learning for Molecular Property Prediction (10/03/2021)
  Predicting molecular properties with data-driven methods has drawn much ...

- Self-supervised Heterogeneous Graph Pre-training Based on Structural Clustering (10/19/2022)
  Recent self-supervised pre-training methods on Heterogeneous Information...

- Dynamic Graph Representation Learning via Graph Transformer Networks (11/19/2021)
  Dynamic graph representation learning is an important task with widespre...

- Graph Contrastive Pre-training for Effective Theorem Reasoning (08/24/2021)
  Interactive theorem proving is a challenging and tedious process, which ...

- Graph Self Supervised Learning: the BT, the HSIC, and the VICReg (05/25/2021)
  Self-supervised learning and pre-training strategies have developed over...
