Dual Temperature Helps Contrastive Learning Without Many Negative Samples: Towards Understanding and Simplifying MoCo

03/30/2022
by Chaoning Zhang, et al.

Contrastive learning (CL) is widely known to require many negative samples, 65536 in MoCo for instance; consequently, dictionary-free frameworks often perform worse because their negative sample size (NSS) is limited by the mini-batch size (MBS). To decouple the NSS from the MBS, a dynamic dictionary has been adopted in many CL frameworks, of which the MoCo family is arguably the most popular. In essence, MoCo adopts a momentum-based queue dictionary, whose size and consistency we analyze in a fine-grained manner. We point out that the InfoNCE loss used in MoCo implicitly attracts anchors to their corresponding positive samples with varying strengths of penalty, and we identify this inter-anchor hardness-awareness property as a major reason why a large dictionary is necessary. These findings motivate us to simplify MoCo v2 by removing both its dictionary and its momentum encoder. Based on InfoNCE with the proposed dual temperature, our simplified frameworks, SimMoCo and SimCo, outperform MoCo v2 by a visible margin. Moreover, our work bridges the gap between CL and non-CL frameworks, contributing to a more unified understanding of these two mainstream families of SSL methods. Code is available at: https://bit.ly/3LkQbaT.
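To make the dual-temperature idea concrete, below is a minimal PyTorch sketch of a dictionary-free, mini-batch InfoNCE loss with two temperatures, assuming that one temperature (tau_a) drives the gradients while a second, stop-gradient temperature (tau_b) re-weights each anchor's contribution. The function name, the specific re-weighting, and the default values are illustrative assumptions rather than the exact loss from the paper; see the released code for the authors' formulation.

import torch
import torch.nn.functional as F

def dual_temperature_infonce(query, key, tau_a=0.1, tau_b=1.0):
    # query, key: (N, D) L2-normalized embeddings of two augmented views of the
    # same N images; negatives are the other samples in the mini-batch, so no
    # queue dictionary and no momentum encoder are needed.
    logits = query @ key.t()                                   # (N, N) cosine similarities
    labels = torch.arange(query.size(0), device=query.device)  # positives on the diagonal

    # Per-anchor cross-entropy under each temperature.
    loss_a = F.cross_entropy(logits / tau_a, labels, reduction="none")
    loss_b = F.cross_entropy(logits / tau_b, labels, reduction="none")

    # Illustrative stop-gradient re-weighting: gradients flow only through the
    # tau_a term, while the detached ratio rescales how strongly each anchor is
    # attracted to its positive (the hardness-awareness knob).
    weight = (loss_b / loss_a.clamp_min(1e-8)).detach()
    return (weight * loss_a).mean()

In a SimCo-style setup, such a loss would simply replace the standard single-temperature InfoNCE, with both views produced by the same encoder and negatives drawn from the current mini-batch.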

Related research

07/16/2022
Model-Aware Contrastive Learning: Towards Escaping Uniformity-Tolerance Dilemma in Training
Instance discrimination contrastive learning (CL) has achieved significa...

02/26/2022
Exploring the Impact of Negative Samples of Contrastive Learning: A Case Study of Sentence Embedding
Contrastive learning is emerging as a powerful technique for extracting ...

07/20/2022
Negative Samples are at Large: Leveraging Hard-distance Elastic Loss for Re-identification
We present a Momentum Re-identification (MoReID) framework that can leve...

04/28/2021
A Note on Connecting Barlow Twins with Negative-Sample-Free Contrastive Learning
In this report, we relate the algorithmic design of Barlow Twins' method...

12/15/2020
Understanding the Behaviour of Contrastive Loss
Unsupervised contrastive learning has achieved outstanding success, whil...

09/21/2023
DimCL: Dimensional Contrastive Learning For Improving Self-Supervised Learning
Self-supervised learning (SSL) has gained remarkable success, for which ...
