Enhancing Efficient Continual Learning with Dynamic Structure Development of Spiking Neural Networks

by Bing Han et al.

Children can learn multiple cognitive tasks sequentially; endowing machines with the same capability remains a major challenge on the road toward artificial general intelligence. Existing continual learning frameworks are usually designed for Deep Neural Networks (DNNs) and leave more brain-inspired, energy-efficient Spiking Neural Networks (SNNs) largely unexplored. Drawing on the continual learning mechanisms of child growth and development, we propose Dynamic Structure Development of Spiking Neural Networks (DSD-SNN) for efficient and adaptive continual learning. When learning a sequence of tasks, DSD-SNN dynamically grows and assigns new neurons to new tasks and prunes redundant neurons, thereby increasing memory capacity and reducing computational overhead. In addition, the overlapping shared structure lets the network quickly transfer all acquired knowledge to new tasks, enabling a single network to support multiple incremental tasks without a separate sub-network mask for each task. We validate the proposed model on multiple class-incremental and task-incremental learning benchmarks. Extensive experiments demonstrate that our model significantly improves performance, learning speed, and memory capacity while reducing computational overhead. Moreover, DSD-SNN achieves performance comparable to DNN-based methods and significantly outperforms the state-of-the-art (SOTA) among existing SNN-based continual learning methods.
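The grow-and-prune dynamic described above can be illustrated with a minimal sketch. The class below is purely hypothetical (it is not the authors' implementation, and the names `DynamicLayer`, `grow`, and `prune` are illustrative): on each new task a small pool of neurons is grown and tagged with the task that created it, redundant neurons with low mean activity are pruned afterward, and the forward pass uses all surviving neurons without any per-task mask, mirroring the shared overlapping structure. A ReLU stands in for the spiking dynamics to keep the example self-contained.

```python
import numpy as np

class DynamicLayer:
    """Toy sketch of grow-and-prune structure development (illustrative only)."""

    def __init__(self, in_dim, grow_per_task=8):
        self.in_dim = in_dim
        self.grow_per_task = grow_per_task
        self.W = np.empty((0, in_dim))       # hidden weights, one row per neuron
        self.owner = np.empty(0, dtype=int)  # task that created each neuron

    def grow(self, task_id, rng):
        """Grow new neurons and assign them to a newly arriving task."""
        new_W = 0.1 * rng.standard_normal((self.grow_per_task, self.in_dim))
        self.W = np.vstack([self.W, new_W])
        self.owner = np.concatenate(
            [self.owner, np.full(self.grow_per_task, task_id)]
        )

    def prune(self, x_batch, threshold):
        """Remove redundant neurons whose mean activity falls below threshold."""
        act = np.maximum(self.W @ x_batch.T, 0.0)  # ReLU stand-in for spike rates
        keep = act.mean(axis=1) >= threshold
        self.W = self.W[keep]
        self.owner = self.owner[keep]
        return keep

    def forward(self, x):
        # All surviving neurons are shared across tasks: no per-task mask.
        return np.maximum(self.W @ x, 0.0)

layer = DynamicLayer(in_dim=4)
rng = np.random.default_rng(0)
layer.grow(task_id=0, rng=rng)   # 8 neurons for task 0
layer.grow(task_id=1, rng=rng)   # 8 more for task 1, sharing the same layer
out = layer.forward(np.ones(4))  # single forward pass over all 16 neurons
```

In the actual DSD-SNN, growth and pruning criteria operate on spiking activity and synaptic importance rather than this ReLU surrogate, but the structural bookkeeping follows the same pattern: capacity expands per task while pruning keeps the overall network compact.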


