ORDisCo: Effective and Efficient Usage of Incremental Unlabeled Data for Semi-supervised Continual Learning

01/02/2021
by   Liyuan Wang, et al.

Continual learning usually assumes that the incoming data are fully labeled, which is often unrealistic in practice. In this work, we consider semi-supervised continual learning (SSCL), which incrementally learns from partially labeled data. Observing that existing continual learning methods lack the ability to continually exploit unlabeled data, we propose deep Online Replay with Discriminator Consistency (ORDisCo), which interdependently learns a classifier with a conditional generative adversarial network (GAN) that continually passes the learned data distribution to the classifier. In particular, ORDisCo replays data sampled from the conditional generator to the classifier in an online manner, exploiting unlabeled data in a time- and storage-efficient way. Further, to explicitly overcome the catastrophic forgetting of unlabeled data, we selectively stabilize parameters of the discriminator that are important for discriminating the pairs of old unlabeled data and their pseudo-labels predicted by the classifier. We extensively evaluate ORDisCo on various semi-supervised learning benchmark datasets for SSCL, and show that ORDisCo achieves significant performance improvements on SVHN, CIFAR10 and Tiny-ImageNet compared to strong baselines.
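The "selective stabilization" of discriminator parameters described above can be illustrated with an importance-weighted quadratic anchor, in the spirit of elastic weight consolidation. The sketch below is a minimal simplification, not ORDisCo's actual implementation: the function name, the toy parameter vectors, and the importance estimates are all hypothetical, and in the paper the importance scores are derived from how much each discriminator parameter matters for old unlabeled data paired with classifier pseudo-labels.

```python
import numpy as np

def importance_penalty(params, old_params, importance, strength=1.0):
    """EWC-style quadratic penalty (a simplification of ORDisCo's
    selective stabilization): parameters with high importance are
    anchored near the values they held after the previous task,
    while unimportant parameters remain free to adapt."""
    return strength * np.sum(importance * (params - old_params) ** 2)

# Hypothetical example: three discriminator parameters drift after
# training on a new task.
old = np.array([1.0, -0.5, 2.0])   # values after the previous task
new = np.array([1.2, -0.5, 1.0])   # values after the current task
imp = np.array([10.0, 10.0, 0.1])  # assumed per-parameter importance

penalty = importance_penalty(new, old, imp)
# The large drift in the third parameter is barely penalized
# (importance 0.1), while the small drift in the first parameter
# dominates the penalty (importance 10.0).
```

In training, this penalty would be added to the discriminator's adversarial loss, so gradient descent trades off fitting new data against preserving the behavior that the classifier's pseudo-labels on old unlabeled data depend on.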


