Prototypical Graph Contrastive Learning

06/17/2021
by   Shuai Lin, et al.

Graph-level representations are critical in various real-world applications, such as predicting the properties of molecules. In practice, however, precise graph annotations are generally expensive and time-consuming to obtain. To address this issue, graph contrastive learning constructs an instance-discrimination task that pulls together positive pairs (augmentations of the same graph) and pushes apart negative pairs (augmentations of different graphs) for unsupervised representation learning. However, because the negatives for a query are sampled uniformly from all graphs, existing methods suffer from a critical sampling bias: the negatives are likely to share the query's semantic structure, which degrades performance. To mitigate this sampling bias, we propose a Prototypical Graph Contrastive Learning (PGCL) approach. Specifically, PGCL models the underlying semantic structure of the graph data by clustering semantically similar graphs into the same group, while simultaneously encouraging clustering consistency across different augmentations of the same graph. Given a query, it then performs negative sampling by drawing graphs from clusters other than the query's own, which guarantees a semantic difference between the query and its negative samples. Moreover, PGCL reweights the negative samples according to the distance between their prototypes (cluster centroids) and the query's prototype, so that negatives at a moderate prototype distance receive relatively large weights. This reweighting strategy proves more effective than uniform sampling. Experimental results on several graph benchmarks demonstrate the advantages of PGCL over state-of-the-art methods.
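The two ideas in the abstract, drawing negatives only from clusters other than the query's and reweighting them by prototype distance, can be illustrated with a minimal NumPy sketch. The function names, the Gaussian-style weighting peaked at the mean prototype distance, and the toy cluster assignments below are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def pgcl_negative_weights(query_proto, neg_protos, sigma=1.0):
    """Weight each negative by its prototype's distance to the query prototype.

    A Gaussian bump centered at the mean distance gives negatives with a
    *moderate* prototype distance relatively large weights (illustrative
    choice; the paper's exact reweighting formula may differ).
    """
    d = np.linalg.norm(neg_protos - query_proto, axis=1)
    w = np.exp(-((d - d.mean()) ** 2) / (2 * sigma ** 2))
    return w / w.sum()  # normalize to a distribution over negatives

def contrastive_loss(query, positive, negatives, neg_weights, tau=0.5):
    """Weighted InfoNCE: pull the positive close, push weighted negatives away."""
    def sim(a, b):
        return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))
    pos = np.exp(sim(query, positive) / tau)
    neg = np.sum(neg_weights * np.exp([sim(query, n) / tau for n in negatives]))
    return -np.log(pos / (pos + neg))

# Toy setup: 8 graph embeddings already assigned to 4 clusters.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(8, 4))
labels = np.array([0, 0, 1, 1, 2, 2, 3, 3])          # cluster assignments
protos = np.stack([embeddings[labels == c].mean(0) for c in range(4)])

q_idx = 0                                            # query graph, cluster 0
neg_mask = labels != labels[q_idx]                   # negatives: other clusters only
negatives = embeddings[neg_mask]
w = pgcl_negative_weights(protos[labels[q_idx]], protos[labels[neg_mask]])
loss = contrastive_loss(embeddings[q_idx], embeddings[1], negatives, w)
```

The cluster-restricted mask realizes the "semantic difference" guarantee (no negative shares the query's cluster), while `w` replaces the uniform weights of standard InfoNCE.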


