Efficient Contrastive Learning via Novel Data Augmentation and Curriculum Learning

09/10/2021
by Seonghyeon Ye, et al.

We introduce EfficientCL, a memory-efficient continual pretraining method that applies contrastive learning with novel data augmentation and curriculum learning. For data augmentation, we stack two operations sequentially: cutoff and PCA jittering. As pretraining proceeds, we apply curriculum learning by incrementing the augmentation degree at each difficulty step. After data augmentation, contrastive learning is applied to the projected embeddings of the original and augmented examples. When fine-tuned on the GLUE benchmark, our model outperforms baseline models, especially on sentence-level tasks. Additionally, this improvement is achieved with only 70% of the memory required by the baseline model.
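To make the pipeline concrete, the sketch below shows the steps described above in PyTorch: a cutoff augmentation, PCA jittering, a curriculum-controlled augmentation degree, and a contrastive loss on the two views. It is a minimal illustration under assumptions, not the authors' implementation: the function names (cutoff_augment, pca_jitter, nt_xent_loss, augmentation_degree), the low-rank PCA via torch.pca_lowrank, the linear curriculum schedule, and the mean-pooled stand-in for a projection head are choices made here for brevity.

    import torch
    import torch.nn.functional as F

    def cutoff_augment(emb, ratio):
        """Zero out a random contiguous span of token embeddings (cutoff)."""
        emb = emb.clone()
        batch, seq_len, _ = emb.shape
        span = max(1, int(seq_len * ratio))
        for b in range(batch):
            start = int(torch.randint(0, seq_len - span + 1, (1,)))
            emb[b, start:start + span, :] = 0.0
        return emb

    def pca_jitter(emb, strength):
        """Add noise along principal components of the token embeddings."""
        batch, seq_len, hidden = emb.shape
        flat = emb.reshape(-1, hidden)
        # Low-rank PCA of the embedding matrix (centered internally).
        _, s, v = torch.pca_lowrank(flat, q=min(8, hidden))
        # Random per-component magnitudes, scaled by the curriculum-controlled strength.
        alphas = torch.randn(s.shape[0], device=emb.device) * strength
        noise = (v * (alphas * s)).sum(dim=1)  # shape: (hidden,)
        return emb + noise.view(1, 1, hidden)

    def nt_xent_loss(z1, z2, temperature=0.1):
        """Contrastive loss between the original and augmented views."""
        z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
        logits = z1 @ z2.t() / temperature                    # (batch, batch) similarities
        targets = torch.arange(z1.size(0), device=z1.device)  # positives on the diagonal
        return F.cross_entropy(logits, targets)

    def augmentation_degree(step, total_steps, max_degree=0.3):
        """Curriculum schedule: the augmentation degree grows as pretraining proceeds."""
        return max_degree * min(1.0, step / total_steps)

    # Example: one contrastive step on token embeddings from an encoder.
    emb = torch.randn(4, 16, 32)                           # (batch, seq_len, hidden)
    degree = augmentation_degree(step=1000, total_steps=10000)
    aug = pca_jitter(cutoff_augment(emb, degree), degree)  # stacked augmentations
    z_orig, z_aug = emb.mean(dim=1), aug.mean(dim=1)       # stand-in for a projection head
    loss = nt_xent_loss(z_orig, z_aug)

In the setting the abstract describes, the two views would come from the pretrained encoder and pass through a projection head before the loss; the sketch mean-pools token embeddings only to keep the example self-contained.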


