Studying Generalization on Memory-Based Methods in Continual Learning

06/16/2023
by Felipe del Rio, et al.

One of the objectives of Continual Learning is to learn new concepts continually over a stream of experiences while avoiding catastrophic forgetting. To mitigate complete knowledge overwriting, memory-based methods store a fraction of the data from previous distributions and replay it during training. Although these methods produce good results, few studies have tested their out-of-distribution generalization properties or whether they overfit the replay memory. In this work, we show that although these methods can help traditional in-distribution generalization, they can strongly impair out-of-distribution generalization by learning spurious features and correlations. Using a controlled environment, the Synbol benchmark generator (Lacoste et al., 2020), we demonstrate that this lack of out-of-distribution generalization mainly occurs in the linear classifier.
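For concreteness, below is a minimal sketch of the kind of memory-based (experience replay) training loop the abstract refers to, written in a PyTorch style. The `ReplayBuffer` class, the reservoir-sampling policy, and the `train_step` function are illustrative assumptions for exposition, not the paper's actual implementation.

```python
import random
import torch
import torch.nn.functional as F


class ReplayBuffer:
    """Fixed-size memory filled by reservoir sampling over the data stream."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.examples = []   # list of (x, y) tensor pairs kept in memory
        self.seen = 0        # number of stream examples observed so far

    def add(self, x, y):
        # x, y are batched tensors; iterate over individual examples
        for xi, yi in zip(x, y):
            self.seen += 1
            if len(self.examples) < self.capacity:
                self.examples.append((xi, yi))
            else:
                # Reservoir sampling: each seen example is kept
                # with probability capacity / seen
                j = random.randint(0, self.seen - 1)
                if j < self.capacity:
                    self.examples[j] = (xi, yi)

    def sample(self, batch_size):
        batch = random.sample(self.examples, min(batch_size, len(self.examples)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)


def train_step(model, optimizer, buffer, x, y, replay_batch_size=32):
    """One replay update: loss on the current batch plus loss on a memory batch."""
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x), y)
    if buffer.examples:
        mx, my = buffer.sample(replay_batch_size)
        loss = loss + F.cross_entropy(model(mx), my)
    loss.backward()
    optimizer.step()
    buffer.add(x, y)
    return loss.item()
```

Reservoir sampling is one common choice because it keeps the memory an approximately uniform sample of the stream under a fixed budget; the abstract's point is that, even with such a buffer, replay-trained models can still pick up spurious correlations that hurt out-of-distribution generalization, mainly in the linear classifier.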
