Semi-supervised Contrastive Learning with Similarity Co-calibration

05/16/2021
by   Yuhang Zhang, et al.

Semi-supervised learning is an effective way to leverage massive unlabeled data. In this paper, we propose a novel training strategy, termed Semi-supervised Contrastive Learning (SsCL), which combines the well-known contrastive loss from self-supervised learning with the cross-entropy loss from semi-supervised learning, and jointly optimizes the two objectives in an end-to-end way. The highlight is that, unlike self-training based semi-supervised learning, which conducts prediction and retraining over the same model weights, SsCL exchanges the predictions on the unlabeled data between the two branches, formulating a co-calibration procedure that we find yields better predictions and avoids being trapped in local minima. Toward this goal, the contrastive loss branch models pairwise similarities among samples, using nearest neighbors generated from the cross-entropy branch, and in turn calibrates the prediction distribution of the cross-entropy branch with the contrastive similarity. We show that SsCL produces more discriminative representations and benefits few-shot learning. Notably, on ImageNet with ResNet50 as the backbone, SsCL achieves 60.2% top-1 accuracy, significantly outperforming the baseline and surpassing previous semi-supervised and self-supervised methods.
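To make the co-calibration idea concrete, below is a minimal sketch of an SsCL-style joint objective, assuming a standard PyTorch setup with two augmented views per image. All names here (sscl_loss, tau, lambda_c, the 0.5 blending weight) are hypothetical illustration choices, not the paper's exact formulation.

```python
# Hypothetical sketch of a joint contrastive + cross-entropy objective
# with co-calibration between the two branches (not the paper's code).
import torch
import torch.nn.functional as F

def sscl_loss(feats_q, feats_k, logits_q, logits_k, labels, labeled_mask,
              tau=0.1, lambda_c=1.0):
    """feats_q, feats_k: L2-normalized projections of two views, (N, D)
    logits_q, logits_k: class logits for the two views, (N, C)
    labels: ground-truth labels, valid where labeled_mask is True, (N,)
    """
    # Supervised cross-entropy on the labeled subset.
    ce = F.cross_entropy(logits_q[labeled_mask], labels[labeled_mask])

    # Pairwise contrastive similarities between the two views.
    sim = feats_q @ feats_k.t() / tau                       # (N, N)

    # Direction 1: the cross-entropy branch nominates nearest neighbors
    # in prediction space as extra positives for the contrastive branch
    # (sketch: samples sharing a pseudo-label are treated as positives).
    pseudo = logits_k.argmax(dim=1)                         # (N,)
    pos = (pseudo.unsqueeze(0) == pseudo.unsqueeze(1)).float()
    log_prob = F.log_softmax(sim, dim=1)
    contrastive = (-(log_prob * pos).sum(1) / pos.sum(1)).mean()

    # Direction 2: contrastive similarities calibrate the pseudo-label
    # targets used to train the cross-entropy branch on unlabeled data
    # (sketch: blend each prediction with a similarity-weighted average
    # of its neighbors' predictions; the targets are detached).
    probs_k = F.softmax(logits_k, dim=1)
    sim_w = F.softmax(sim, dim=1)
    calibrated = 0.5 * probs_k + 0.5 * (sim_w @ probs_k)
    unl = ~labeled_mask
    ce_u = -(calibrated[unl].detach()
             * F.log_softmax(logits_q[unl], dim=1)).sum(1).mean()

    return ce + ce_u + lambda_c * contrastive
```

Detaching the calibrated targets acts as a stop-gradient, so each branch only supervises the other rather than collapsing into its own predictions, which mirrors the co-calibration intuition described above.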


Related research

04/11/2023 · Semi-Supervised Relational Contrastive Learning
Disease diagnosis from medical images via supervised learning is usually...

01/08/2023 · Learning the Relation between Similarity Loss and Clustering Loss in Self-Supervised Learning
Self-supervised learning enables networks to learn discriminative featur...

06/17/2020 · LSD-C: Linearly Separable Deep Clusters
We present LSD-C, a novel method to identify clusters in an unlabeled da...

09/29/2021 · Cross-domain Semi-Supervised Audio Event Classification Using Contrastive Regularization
In this study, we proposed a novel semi-supervised training method that ...

02/01/2022 · Mars Terrain Segmentation with Less Labels
Planetary rover systems need to perform terrain segmentation to identify...

10/14/2021 · Self-Supervised Learning by Estimating Twin Class Distributions
We present TWIST, a novel self-supervised representation learning method...

05/16/2022 · Sharp Asymptotics of Self-training with Linear Classifier
Self-training (ST) is a straightforward and standard approach in semi-su...
