Towards the Sparseness of Projection Head in Self-Supervised Learning

07/18/2023
by Zeen Song, et al.

In recent years, self-supervised learning (SSL) has emerged as a promising approach for extracting valuable representations from unlabeled data. One successful SSL method is contrastive learning, which aims to bring positive examples closer together while pushing negative examples apart. Many current contrastive learning approaches utilize a parameterized projection head. Through a combination of empirical analysis and theoretical investigation, we provide insights into the internal mechanism of the projection head and its relationship with the phenomenon of dimensional collapse. Our findings demonstrate that the projection head enhances the quality of representations by applying the contrastive loss in a projected subspace. Based on this, we hypothesize that only a subset of features is necessary to minimize the contrastive loss on a mini-batch of data. Theoretical analysis further suggests that a sparse projection head can improve generalization, leading us to introduce SparseHead, a regularization term that constrains the sparsity of the projection head and can be seamlessly integrated with any SSL approach. Our experimental results validate the effectiveness of SparseHead, demonstrating its ability to improve the performance of existing contrastive methods.
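
The abstract does not spell out the exact form of SparseHead, but the setup it describes, a contrastive loss computed on the output of a projection head plus a term that constrains the head's sparsity, can be sketched as follows. This is a minimal illustration only: the ProjectionHead module, the nt_xent loss, the l1_penalty surrogate, the lambda_sparse weight, and the encoder in the usage comments are our assumptions, not the paper's implementation.

# Hypothetical sketch: a SimCLR-style contrastive loss with an L1 penalty on the
# projection-head weights. The L1 term only illustrates "constraining the
# sparsity of the projection head"; the paper's actual regularizer may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ProjectionHead(nn.Module):
    """Two-layer MLP projection head, as commonly used in contrastive SSL."""

    def __init__(self, in_dim=2048, hidden_dim=2048, out_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, h):
        return self.net(h)

    def l1_penalty(self):
        # Sparsity surrogate: sum of absolute weight values in the head.
        return sum(p.abs().sum() for p in self.parameters())


def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent (InfoNCE) loss over two batches of projected views."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                 # (2N, d)
    sim = z @ z.t() / temperature                  # pairwise cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim.masked_fill_(mask, float("-inf"))          # exclude self-similarity
    # The positive for sample i is i + n (and vice versa).
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)


# Usage inside a training step (encoder and augmented views x1, x2 assumed given):
#   h1, h2 = encoder(x1), encoder(x2)
#   z1, z2 = head(h1), head(h2)
#   loss = nt_xent(z1, z2) + lambda_sparse * head.l1_penalty()

The key design point the abstract motivates is that the contrastive loss is minimized on the projected features z rather than on the encoder output h, so pushing the head toward sparsity restricts which feature directions the loss acts on without constraining the representation itself.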

Related research

12/22/2022  Understanding and Improving the Role of Projection Head in Self-Supervised Learning
Self-supervised learning (SSL) aims to produce useful feature representa...

05/12/2022  The Mechanism of Prediction Head in Non-contrastive Self-supervised Learning
Recently the surprising discovery of the Bootstrap Your Own Latent (BYOL...

06/08/2021  Provable Guarantees for Self-Supervised Deep Learning with Spectral Contrastive Loss
Recent works in self-supervised learning have advanced the state-of-the-...

01/28/2023  Deciphering the Projection Head: Representation Evaluation Self-supervised Learning
Self-supervised learning (SSL) aims to learn intrinsic features without ...

02/18/2023  Data-Efficient Contrastive Self-supervised Learning: Easy Examples Contribute the Most
Self-supervised learning (SSL) learns high-quality representations from ...

06/03/2022  On the duality between contrastive and non-contrastive self-supervised learning
Recent approaches in self-supervised learning of image representations c...

10/13/2021  Decoupled Contrastive Learning
Contrastive learning (CL) is one of the most successful paradigms for se...
