Meta-Learning Representations for Continual Learning

05/29/2019 · by Khurram Javed, et al.

A continual learning agent should be able to build on top of existing knowledge to learn on new data quickly while minimizing forgetting. Current intelligent systems based on neural network function approximators arguably do the opposite: they are highly prone to forgetting and rarely trained to facilitate future learning. One reason for this poor behavior is that they learn from a representation that is not explicitly trained for these two goals. In this paper, we propose MRCL, an objective to explicitly learn representations that accelerate future learning and are robust to forgetting under online updates in continual learning. The idea is to optimize the representation such that online updates minimize error on all samples with little forgetting. We show that it is possible to learn representations that are more effective for online updating and that sparsity naturally emerges in these representations. Moreover, our method is complementary to existing continual learning strategies, like MER, which can learn more effectively from representations learned by our objective. Finally, we demonstrate that a basic online updating strategy with our learned representation is competitive with rehearsal based methods for continual learning. We release an implementation of our method at https://github.com/khurramjaved96/mrcl.
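To make the objective concrete, below is a minimal first-order sketch in PyTorch of the kind of bilevel optimization the abstract describes: an inner loop of online SGD on a "fast" prediction head over a correlated stream, and an outer loop that updates the "slow" representation so that those online updates leave error low on samples from all tasks. This is not the authors' implementation (see the linked repository for that); the network sizes, learning rates, and the `meta_step` helper are illustrative assumptions, and the paper's actual objective backpropagates through the inner updates with higher-order gradients rather than the first-order shortcut used here.

```python
# Illustrative first-order sketch of an MRCL-style meta-objective (not the
# official code; see https://github.com/khurramjaved96/mrcl for that).
import copy
import torch
import torch.nn as nn

# Assumed toy setup: a "slow" representation network that is only meta-learned,
# and a "fast" prediction head that is updated online in the inner loop.
representation = nn.Sequential(nn.Linear(10, 64), nn.ReLU())
predictor = nn.Linear(64, 1)
meta_opt = torch.optim.Adam(representation.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

def meta_step(trajectory, remember_x, remember_y, inner_lr=0.01):
    """One outer-loop step: simulate online SGD on the prediction head over a
    correlated stream, then update the representation so the post-update error
    on samples from all tasks stays low (i.e., little forgetting)."""
    fast = copy.deepcopy(predictor)  # fresh fast weights for this meta-step
    inner_opt = torch.optim.SGD(fast.parameters(), lr=inner_lr)

    # Inner loop: online updates on a stream of (x, y) pairs from one task.
    for x, y in trajectory:
        inner_opt.zero_grad()
        # .detach() keeps inner-loop gradients out of the representation,
        # a first-order shortcut; the paper differentiates through these steps.
        loss_fn(fast(representation(x).detach()), y).backward()
        inner_opt.step()

    # Outer loop: evaluate the updated head on a batch drawn across all tasks
    # seen so far and adapt the representation to minimize that error.
    meta_opt.zero_grad()
    meta_loss = loss_fn(fast(representation(remember_x)), remember_y)
    meta_loss.backward()
    meta_opt.step()
    return meta_loss.item()
```

At meta-test time, under this reading of the objective, only the fast head is updated online on the non-stationary stream while the meta-learned representation stays frozen; the sparsity the abstract mentions emerges in that representation's activations.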


Related research

07/30/2020 · Bilevel Continual Learning
Continual learning aims to learn continuously from a stream of tasks and...

03/12/2020 · Online Fast Adaptation and Knowledge Accumulation: a New Approach to Continual Learning
Learning from non-stationary data remains a great challenge for machine ...

08/20/2023 · A Comprehensive Empirical Evaluation on Online Continual Learning
Online continual learning aims to get closer to a live learning experien...

03/03/2022 · Provable and Efficient Continual Representation Learning
In continual learning (CL), the goal is to design models that can learn ...

06/18/2023 · IF2Net: Innately Forgetting-Free Networks for Continual Learning
Continual learning can incrementally absorb new concepts without interfe...

07/16/2023 · A Comprehensive Survey of Forgetting in Deep Learning Beyond Continual Learning
Forgetting refers to the loss or deterioration of previously acquired in...

10/10/2022 · A Simple Baseline that Questions the Use of Pretrained-Models in Continual Learning
With the success of pretraining techniques in representation learning, a...
