ScoreCL: Augmentation-Adaptive Contrastive Learning via Score-Matching Function

06/07/2023
by   Jinyoung Kim, et al.

Self-supervised contrastive learning (CL) has achieved state-of-the-art performance in representation learning by minimizing the distance between positive pairs while maximizing that between negative ones. Recently, it has been verified that the model learns better representations from diversely augmented positive pairs, because they make the model more view-invariant. However, only a few CL studies have considered the difference between augmented views, and they have not gone beyond hand-crafted findings. In this paper, we first observe that the score-matching function can measure how much the data has been changed from the original by augmentation. With this property, every pair in CL can be weighted adaptively by the difference of its score values, which boosts the performance of existing CL methods. We show the generality of our method, referred to as ScoreCL, by consistently improving various CL methods, SimCLR, SimSiam, W-MSE, and VICReg, by up to 3%p in k-NN evaluation on CIFAR-10, CIFAR-100, and ImageNet-100. Moreover, we have conducted exhaustive experiments and ablations, including results on diverse downstream tasks, comparisons with possible baselines, and improvements when combined with other proposed augmentation methods. We hope our exploration will inspire more research into exploiting score matching for CL.
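The core idea, weighting each positive pair by how differently its two views were changed, as measured by a score function, can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the linear `score_fn` stands in for a pretrained score network, and the gap-based weighting and reweighted InfoNCE loss are plausible assumptions about how the score difference could enter the objective.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a pretrained score network s(x) ~ grad_x log p(x);
# a fixed random linear map is used only so the example runs end to end.
W = rng.standard_normal((8, 8))
def score_fn(x):
    return x @ W

def pair_weights(views1, views2, score_fn):
    """Weight each positive pair by the gap between the score norms of its
    two augmented views (larger gap -> views changed more differently by
    augmentation -> larger weight). Weights are normalized to mean 1."""
    s1 = np.linalg.norm(score_fn(views1), axis=1)
    s2 = np.linalg.norm(score_fn(views2), axis=1)
    gap = np.abs(s1 - s2)
    return gap / gap.sum() * len(gap)

def weighted_infonce(z1, z2, weights, temperature=0.5):
    """Batch InfoNCE with per-pair weights on the positive terms."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature          # (N, N) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    pos = np.diag(log_probs)                  # positives sit on the diagonal
    return -(weights * pos).mean()

x1 = rng.standard_normal((4, 8))              # "view 1" of a batch of 4
x2 = x1 + 0.1 * rng.standard_normal((4, 8))   # lightly augmented "view 2"
w = pair_weights(x1, x2, score_fn)
loss = weighted_infonce(x1, x2, w)
```

Pairs whose views were perturbed very differently contribute more to the loss, which is one way to make the objective augmentation-adaptive rather than treating all positive pairs uniformly.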


