Tackling Oversmoothing of GNNs with Contrastive Learning

by Lecheng Zheng, et al.

Graph neural networks (GNNs) combine the rich relational structure of graph data with the representation learning capability of neural networks, making them one of the most popular deep learning methods, with state-of-the-art performance in many applications such as natural language processing and computer vision. In real-world scenarios, increasing the depth (i.e., the number of layers) of a GNN is sometimes necessary to capture more latent knowledge of the input data and to mitigate the uncertainty caused by missing values. However, adding more complex structure and more parameters can decrease the performance of GNN models. One recently identified cause is oversmoothing, though the relevant research remains nascent. In general, oversmoothing makes the final representations of nodes indiscriminative, deteriorating node classification and link prediction performance. In this paper, we first survey current de-oversmoothing methods and propose three major criteria for evaluating a de-oversmoothing method: a constant divergence indicator, an easy-to-determine divergence indicator, and a model-agnostic strategy. We then propose the Topology-guided Graph Contrastive Layer (TGCL), the first de-oversmoothing method satisfying all three criteria. Taking a contrastive learning perspective, we provide a theoretical analysis of the effectiveness of the proposed TGCL. Finally, we design extensive experiments to illustrate the empirical performance of TGCL compared with state-of-the-art baselines.
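To make the oversmoothing phenomenon concrete, here is a minimal NumPy sketch, not taken from the paper: it shows that stacking many rounds of mean aggregation over a small graph drives all node representations toward the same vector, so they become indiscriminative. The toy graph, the initial features, and the `spread` helper are all illustrative assumptions.

```python
import numpy as np

# Toy illustration of oversmoothing (not the paper's TGCL method):
# repeatedly averaging each node's features with its neighbors'
# collapses all node representations toward a single vector.

# Small undirected graph: a path over 4 nodes, 0-1-2-3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                      # add self-loops
D_inv = np.diag(1.0 / A_hat.sum(axis=1))   # inverse degree matrix
P = D_inv @ A_hat                          # mean-aggregation operator (one "layer")

# Initially well-separated node features
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])

def spread(X):
    # Maximum pairwise distance between node representations;
    # near zero means the nodes are indistinguishable.
    n = len(X)
    return max(np.linalg.norm(X[i] - X[j])
               for i in range(n) for j in range(n))

print("initial spread:", spread(X))        # clearly nonzero
for _ in range(50):                        # 50 aggregation layers
    X = P @ X
print("spread after 50 layers:", spread(X))  # vanishingly small
```

Because `P` is row-stochastic and the graph is connected with self-loops, `P**k` converges to a rank-one matrix as `k` grows, which is exactly why every node ends up with (nearly) the same representation; this is the collapse that de-oversmoothing methods such as TGCL aim to prevent.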


