Understanding Contrastive Learning Through the Lens of Margins

06/20/2023
by Daniel Rho, et al.

Self-supervised learning, or SSL, holds the key to expanding the use of machine learning in real-world tasks by alleviating the need for heavy human supervision. Contrastive learning and its variants have been widely used SSL strategies in various fields. We use margins as a stepping stone for understanding how contrastive learning works at a deeper level and for providing potential directions to improve representation learning. Through gradient analysis, we find that margins scale gradients in three different ways: emphasizing positive samples, de-emphasizing positive samples when the angles of positive samples are wide, and attenuating the diminishing gradients as the estimated probability approaches the target probability. We analyze each effect separately and provide possible directions for improving SSL frameworks. Our experimental results demonstrate that these properties can contribute to acquiring better representations, which can enhance performance on both seen and unseen datasets.
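To make the role of a margin concrete, here is a minimal sketch of an InfoNCE-style contrastive loss with an additive angular margin (ArcFace-style) applied to the positive pair. This illustrates the general technique the abstract discusses, not the authors' exact formulation; the function name `margin_info_nce` and the hyperparameter values are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def margin_info_nce(z_a, z_b, margin=0.1, temperature=0.1):
    """InfoNCE with an additive angular margin on the positive pair.

    z_a, z_b: (N, D) embeddings of two augmented views of the same N inputs.
    The positive logit uses cos(theta + m) instead of cos(theta), so a
    positive pair must be closer by an angular margin m to score as high,
    which rescales the gradients flowing through positive samples.
    """
    z_a = F.normalize(z_a, dim=1)
    z_b = F.normalize(z_b, dim=1)
    sim = z_a @ z_b.t()                               # (N, N) cosine similarities
    pos = sim.diagonal().clamp(-1 + 1e-7, 1 - 1e-7)   # positive-pair cosines
    pos_margin = torch.cos(torch.acos(pos) + margin)  # cos(theta + m)
    idx = torch.arange(sim.size(0), device=sim.device)
    logits = sim.clone()
    logits[idx, idx] = pos_margin                     # margin on positives only
    return F.cross_entropy(logits / temperature, idx) # row i's positive is column i
```

Setting `margin=0` recovers plain InfoNCE; the temperature and margin defaults above are placeholders, not values from the paper.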

Related research

10/28/2022 · Improving the Modality Representation with Multi-View Contrastive Learning for Multimodal Sentiment Analysis
Modality representation learning is an important problem for multimodal ...

10/13/2020 · MixCo: Mix-up Contrastive Learning for Visual Representation
Contrastive learning has shown remarkable results in recent self-supervi...

09/21/2023 · DimCL: Dimensional Contrastive Learning For Improving Self-Supervised Learning
Self-supervised learning (SSL) has gained remarkable success, for which ...

03/25/2022 · Chaos is a Ladder: A New Theoretical Understanding of Contrastive Learning via Augmentation Overlap
Recently, contrastive learning has risen to be a promising approach for ...

02/08/2021 · Quantifying and Mitigating Privacy Risks of Contrastive Learning
Data is the key factor to drive the development of machine learning (ML)...

03/30/2022 · How Does SimSiam Avoid Collapse Without Negative Samples? A Unified Understanding with Self-supervised Contrastive Learning
To avoid collapse in self-supervised learning (SSL), a contrastive loss ...

10/20/2022 · Does Decentralized Learning with Non-IID Unlabeled Data Benefit from Self Supervision?
Decentralized learning has been advocated and widely deployed to make ef...
