Efficient block contrastive learning via parameter-free meta-node approximation

by Gayan K. Kulatilleke, et al.

Contrastive learning has recently achieved remarkable success in many domains, including graphs. However, contrastive loss, especially for graphs, requires a large number of negative samples, which is unscalable and computationally prohibitive with quadratic time complexity. Sub-sampling is not optimal, and incorrect negative sampling leads to sampling bias. In this work, we propose a meta-node-based approximation technique that can (a) proxy all negative combinations, (b) in time quadratic in the number of clusters rather than the number of nodes, (c) at the graph level, not the node level, and (d) exploit graph sparsity. By replacing node pairs with additive cluster pairs, we compute the negatives in cluster time at the graph level. The resulting Proxy approximated meta-node Contrastive (PamC) loss, based on simple optimized GPU operations, captures the full set of negatives yet remains efficient, with linear time complexity. By avoiding sampling, we effectively eliminate sampling bias. We also satisfy the criterion of a larger number of samples, thus achieving block contrastiveness, which is proven to outperform pair-wise losses. We use learnt soft cluster assignments for the meta-node construction, avoiding the heterophily and noise that edge creation can introduce. Theoretically, we show that real-world graphs easily satisfy the conditions necessary for our approximation. Empirically, we show promising accuracy gains over state-of-the-art graph clustering on 6 benchmarks. Importantly, we gain substantially in efficiency: up to 3x in training time, 1.8x in inference time, and over 5x in GPU memory reduction.
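The core idea of replacing node pairs with additive cluster pairs can be illustrated with a short sketch. The code below is a minimal NumPy illustration, not the authors' exact PamC formulation: it assumes cosine-normalized node embeddings `Z`, soft cluster assignments `P` (a hypothetical input, rows summing to 1), and approximates the O(n^2) sum over negative node pairs by an O(k^2) sum over k additive meta-nodes `M = P^T Z`.

```python
import numpy as np

def pamc_negatives_sketch(Z, P, tau=0.5):
    """Meta-node approximation of the negative-pair term (illustrative only).

    Z : (n, d) node embeddings
    P : (n, k) soft cluster assignments, rows sum to 1
    tau : temperature, as in a standard contrastive loss

    Instead of summing exp(sim(z_i, z_j) / tau) over all O(n^2) node
    pairs, aggregate nodes into k additive meta-nodes and sum over the
    O(k^2) off-diagonal meta-node pairs as a proxy.
    """
    Z = Z / np.linalg.norm(Z, axis=1, keepdims=True)  # cosine space
    M = P.T @ Z                                       # (k, d) additive meta-nodes
    S = (M @ M.T) / tau                               # (k, k) meta-node similarities
    E = np.exp(S)
    # off-diagonal meta-node pairs proxy the full set of negative combinations
    return E.sum() - np.trace(E)
```

With a single cluster there are no off-diagonal meta-node pairs, so the term is zero; with more clusters the proxy grows with inter-cluster similarity, which is what makes the approximation cheap: its cost depends on k, not n.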


