More Is Better: An Analysis of Instance Quantity/Quality Trade-off in Rehearsal-based Continual Learning

05/28/2021
by   Francesco Pelosin, et al.

The design of machines and algorithms capable of learning in a dynamically changing environment has become an increasingly topical problem as the size and heterogeneity of the data available to learning systems grow. As a consequence, the central challenge of Continual Learning has become addressing the stability-plasticity dilemma of connectionist systems: models must adapt without forgetting previously acquired knowledge. Within this context, rehearsal-based methods, i.e., solutions in which the learner exploits a memory buffer to revisit past data, have proven very effective, achieving state-of-the-art performance. In our study, we analyze the memory quantity/quality trade-off, adopting various data reduction approaches to increase the number of instances that can be stored in memory. In particular, we investigate complex instance compression techniques such as deep encoders, as well as simpler approaches such as image resizing and linear dimensionality reduction. Our findings suggest that the optimal trade-off is severely skewed toward instance quantity: rehearsal approaches storing many heavily compressed instances easily outperform state-of-the-art approaches given the same memory budget. Further, in high-memory configurations, deep approaches that extract spatial structure, combined with extreme resizing (on the order of 8×8 images), yield the best results, while in memory-constrained configurations, where deep approaches cannot be used due to their memory requirements during training, Extreme Learning Machines (ELM) offer a clear advantage.
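To make the quantity/quality trade-off concrete, here is a minimal sketch, not the paper's code, of a rehearsal buffer that stores downscaled copies of images under a fixed byte budget, so that many compressed instances fit where few full-resolution ones would. All names (RehearsalBuffer, downscale, budget_bytes, side) and the synthetic stream are illustrative assumptions; average pooling stands in for the resizing step.

```python
# Hypothetical illustration of the quantity/quality trade-off in rehearsal:
# a fixed byte budget holds many low-resolution instances instead of few
# full-resolution ones. Names and parameters are assumptions, not the
# authors' implementation.
import numpy as np


def downscale(img: np.ndarray, side: int) -> np.ndarray:
    """Average-pool an HxWxC uint8 image down to side x side x C."""
    h, w, c = img.shape
    fh, fw = h // side, w // side
    cropped = img[: fh * side, : fw * side]
    return cropped.reshape(side, fh, side, fw, c).mean(axis=(1, 3))


class RehearsalBuffer:
    """Fixed-byte-budget rehearsal memory filled by reservoir sampling."""

    def __init__(self, budget_bytes: int, side: int = 8, channels: int = 3):
        self.side = side
        bytes_per_item = side * side * channels  # stored as uint8
        self.capacity = budget_bytes // bytes_per_item
        self.images, self.labels = [], []
        self.seen = 0

    def add(self, img: np.ndarray, label: int) -> None:
        small = downscale(img, self.side).astype(np.uint8)
        if len(self.images) < self.capacity:
            self.images.append(small)
            self.labels.append(label)
        else:
            # Reservoir sampling keeps a uniform sample of the whole stream.
            j = np.random.randint(0, self.seen + 1)
            if j < self.capacity:
                self.images[j], self.labels[j] = small, label
        self.seen += 1

    def sample(self, n: int):
        idx = np.random.choice(len(self.images), size=min(n, len(self.images)), replace=False)
        return [self.images[i] for i in idx], [self.labels[i] for i in idx]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Budget equal to 200 raw 32x32x3 images; 8x8 storage fits 3,200 instead.
    buf = RehearsalBuffer(budget_bytes=200 * 32 * 32 * 3, side=8)
    for _ in range(10_000):  # synthetic stand-in for a continual data stream
        img = rng.integers(0, 256, (32, 32, 3), dtype=np.uint8)
        buf.add(img, int(rng.integers(0, 10)))
    print(f"capacity: {buf.capacity} compressed instances vs. 200 full-resolution ones")
```

Under this assumed setup, the same 200-image budget holds 3,200 instances at 8×8 resolution, the regime the abstract reports as winning the trade-off; swapping `downscale` for a deep encoder or a linear projection would cover the other compression schemes studied.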


