Adaptive Reorganization of Neural Pathways for Continual Learning with Hybrid Spiking Neural Networks

by Bing Han, et al.

The human brain can self-organize rich and diverse sparse neural pathways to incrementally master hundreds of cognitive tasks. However, most existing continual learning algorithms for deep artificial and spiking neural networks cannot adequately auto-regulate the limited resources in the network, which leads to a drop in performance and a rise in energy consumption as the number of tasks increases. In this paper, we propose a brain-inspired continual learning algorithm with adaptive reorganization of neural pathways, which employs Self-Organizing Regulation networks to reorganize a single, limited Spiking Neural Network (SOR-SNN) into rich sparse neural pathways that efficiently cope with incremental tasks. The proposed model demonstrates consistent superiority in performance, energy consumption, and memory capacity on diverse continual learning tasks ranging from child-like simple tasks to complex ones, as well as on the generalized CIFAR100 and ImageNet datasets. In particular, the SOR-SNN model excels at learning more complex tasks as well as a larger number of tasks, and is able to integrate past knowledge with information from the current task, showing a backward transfer ability that facilitates the old tasks. Meanwhile, the proposed model exhibits a self-repairing ability in the face of irreversible damage: for pruned networks, it can automatically allocate new pathways from the retained network to recover the memory of forgotten knowledge.
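The core idea of reorganizing one fixed network into task-specific sparse pathways can be sketched with per-task binary masks gating the weights of a simple leaky integrate-and-fire (LIF) layer. The sketch below is a hypothetical illustration, not the paper's SOR-SNN implementation: in the actual model a learned self-organizing regulation network would produce the masks, whereas here `allocate_pathway` simply samples a random sparse mask for each task.

```python
import numpy as np

rng = np.random.default_rng(0)

class MaskedSpikingLayer:
    """Toy LIF layer whose weights are gated by per-task binary masks,
    loosely illustrating task-specific sparse pathways carved out of
    one shared network (hypothetical sketch)."""

    def __init__(self, n_in, n_out, threshold=1.0, decay=0.9):
        self.w = rng.normal(0.0, 0.5, size=(n_in, n_out))  # shared weights
        self.threshold = threshold
        self.decay = decay
        self.masks = {}  # task_id -> binary mask over the weight matrix

    def allocate_pathway(self, task_id, sparsity=0.5):
        # A self-organizing regulator would learn this mask; here we
        # sample a random sparse pathway purely for illustration.
        self.masks[task_id] = (rng.random(self.w.shape) > sparsity).astype(float)

    def forward(self, spikes_in, task_id, steps=10):
        """Run LIF dynamics for `steps` timesteps on binary input spikes,
        using only the weights in the task's pathway."""
        masked_w = self.w * self.masks[task_id]
        v = np.zeros(self.w.shape[1])          # membrane potentials
        out_spikes = np.zeros_like(v)          # spike counts per neuron
        for _ in range(steps):
            v = self.decay * v + spikes_in @ masked_w
            fired = v >= self.threshold
            out_spikes += fired
            v[fired] = 0.0                     # reset after a spike
        return out_spikes

layer = MaskedSpikingLayer(n_in=8, n_out=4)
layer.allocate_pathway("task_A", sparsity=0.7)
x = (rng.random(8) > 0.5).astype(float)        # binary input spikes
counts = layer.forward(x, "task_A")            # spike counts, shape (4,)
```

Because each task only activates its own mask, pathways for old tasks are left untouched when a new task is allocated, which is the structural property the abstract attributes to sparse pathway reorganization.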




Enhancing Efficient Continual Learning with Dynamic Structure Development of Spiking Neural Networks

Children possess the ability to learn multiple cognitive tasks sequentia...

Triple Memory Networks: a Brain-Inspired Method for Continual Learning

Continual acquisition of novel experience without interfering with previously...

Emergence of Brain-inspired Small-world Spiking Neural Network through Neuroevolution

The human brain is the product of evolution over hundreds of millions of...

Bayesian Continual Learning via Spiking Neural Networks

Among the main features of biological intelligence are energy efficiency...

Training Spiking Neural Networks for Cognitive Tasks: A Versatile Framework Compatible to Various Temporal Codes

Conventional modeling approaches have found limitations in matching the ...

Effective prevention of semantic drift as angular distance in memory-less continual deep neural networks

Lifelong machine learning or continual learning models attempt to learn ...
