Continual learning under domain transfer with sparse synaptic bursting

by Shawn L. Beaulieu et al.

Existing machines are functionally specific tools that were made for easy prediction and control. Tomorrow's machines may be closer to biological systems in their mutability, resilience, and autonomy. But first they must be capable of learning, and retaining, new information without repeated exposure to it. Past efforts to engineer such systems have sought to build or regulate artificial neural networks using task-specific modules with constrained circumstances of application. This has not yet enabled continual learning over long sequences of previously unseen data without corrupting existing knowledge: a problem known as catastrophic forgetting. In this paper, we introduce a system that can learn sequentially over previously unseen datasets (ImageNet, CIFAR-100) with little forgetting over time. This is accomplished by regulating the activity of weights in a convolutional neural network on the basis of inputs using top-down modulation generated by a second feed-forward neural network. We find that our method learns continually under domain transfer with sparse bursts of activity in weights that are recycled across tasks, rather than by maintaining task-specific modules. Sparse synaptic bursting is found to balance enhanced and diminished activity in a way that facilitates adaptation to new inputs without corrupting previously acquired functions. This behavior emerges during a prior meta-learning phase in which regulated synapses are selectively disinhibited, or grown, from an initial state of uniform suppression.
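The core mechanism described above, a second feed-forward network that generates input-dependent, top-down modulation of the weights in a prediction network, can be sketched as follows. This is a minimal illustration with hypothetical names and dimensions, not the authors' implementation: a small modulatory network maps each input to a per-weight gate in [0, 1], and the prediction layer uses the gated weights for that input only.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Prediction layer: weights W map inputs (dim 8) to outputs (dim 4).
# (Dimensions are illustrative; the paper uses a convolutional network.)
W = rng.normal(size=(8, 4))

# Modulatory network: one hidden layer that emits one gate per weight of W.
M1 = rng.normal(size=(8, 16))
M2 = rng.normal(size=(16, 8 * 4))

def modulated_forward(x):
    # Top-down modulation: the gate is a function of the current input x,
    # so different inputs activate different (sparse) subsets of synapses.
    h = np.tanh(x @ M1)
    gate = sigmoid(h @ M2).reshape(8, 4)  # one multiplier per weight, in [0, 1]
    # Element-wise gating enhances or suppresses each synapse for this input.
    return x @ (gate * W), gate

x = rng.normal(size=8)
y, gate = modulated_forward(x)
```

In this sketch the gates are learned end to end; the abstract's "initial state of uniform suppression" would correspond to initializing the modulatory output so that all gates start near zero, with useful synapses selectively disinhibited during the meta-learning phase.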




Related research:

- Continual Reinforcement Learning with Complex Synapses
- Learning to modulate random weights can induce task-specific contexts for economical meta and continual learning
- Bio-Inspired, Task-Free Continual Learning through Activity Regularization
- Continual learning benefits from multiple sleep mechanisms: NREM, REM, and Synaptic Downscaling
- Hierarchically Structured Task-Agnostic Continual Learning
- HAT-CL: A Hard-Attention-to-the-Task PyTorch Library for Continual Learning
- Enabling Continual Learning with Differentiable Hebbian Plasticity
