Continual Learning: Fast and Slow

09/06/2022
by   Quang Pham, et al.

According to the Complementary Learning Systems (CLS) theory <cit.> in neuroscience, humans achieve effective continual learning through two complementary systems: a fast learning system centered on the hippocampus for rapid learning of specific, individual experiences; and a slow learning system located in the neocortex for the gradual acquisition of structured knowledge about the environment. Motivated by this theory, we propose DualNets (for Dual Networks), a general continual learning framework comprising a fast learning system for supervised learning of pattern-separated representations from specific tasks and a slow learning system for learning task-agnostic, general representations via Self-Supervised Learning (SSL). DualNets can seamlessly incorporate both representation types into a holistic framework to facilitate better continual learning in deep neural networks. Via extensive experiments, we demonstrate the promising results of DualNets on a wide range of continual learning protocols, ranging from the standard offline, task-aware setting to the challenging online, task-free scenario. Notably, on the CTrL <cit.> benchmark, which contains unrelated tasks with vastly different visual images, DualNets achieve competitive performance with existing state-of-the-art dynamic architecture strategies <cit.>. Furthermore, we conduct comprehensive ablation studies to validate DualNets' efficacy, robustness, and scalability. Code is publicly available at <https://github.com/phquang/DualNet>.
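The two-system composition described above can be illustrated with a toy sketch. All class and method names here are hypothetical stand-ins, not the authors' actual API: a slow learner produces a task-agnostic representation (in the full method, trained with an SSL objective), and a fast learner adapts that representation for the supervised task at hand.

```python
# Toy sketch of the fast/slow dual-system idea. Names and the tiny "models"
# are illustrative assumptions, not the DualNets implementation.

class SlowLearner:
    """Task-agnostic feature extractor (in DualNets, trained via SSL)."""
    def __init__(self, scale=0.5):
        self.scale = scale  # stand-in for learned representation weights

    def features(self, x):
        # A trivial "general representation": scaled inputs.
        return [self.scale * v for v in x]


class FastLearner:
    """Task-specific head for supervised learning on top of slow features."""
    def __init__(self, weights):
        self.weights = weights  # stand-in for a per-task classifier

    def predict(self, feats):
        # Weighted sum as a stand-in for a classification head.
        return sum(w * f for w, f in zip(self.weights, feats))


class DualNet:
    """Composes both systems: slow features feed the fast, task-aware head."""
    def __init__(self, slow, fast):
        self.slow, self.fast = slow, fast

    def forward(self, x):
        return self.fast.predict(self.slow.features(x))


net = DualNet(SlowLearner(scale=0.5), FastLearner(weights=[1.0, 2.0]))
print(net.forward([2.0, 4.0]))  # 0.5*2*1.0 + 0.5*4*2.0 = 5.0
```

The design choice this mirrors is that only the fast learner is task-specific; the slow learner's representation is shared across tasks, which is what lets the framework serve both task-aware and task-free protocols.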


