Technical report: supervised training of convolutional spiking neural networks with PyTorch

by Romain Zimmer et al.

Recently, it has been shown that spiking neural networks (SNNs) can be trained efficiently, in a supervised manner, using backpropagation through time. Indeed, the most commonly used spiking neuron model, the leaky integrate-and-fire neuron, obeys a differential equation which can be approximated using discrete time steps, leading to a recurrent relation for the potential. The firing threshold causes optimization issues, but these can be overcome using a surrogate gradient. Here, we extend previous approaches in two ways. Firstly, we show that the approach can be used to train convolutional layers. Convolutions can be done in space, in time (which simulates conduction delays), or in both. Secondly, we include fast horizontal connections à la Denève: when a neuron N fires, we subtract from the potentials of all the neurons with the same receptive field the dot product between their weight vectors and that of neuron N. As Denève et al. showed, this is useful to represent a dynamic multidimensional analog signal in a population of spiking neurons. Here we demonstrate that, in addition, such connections also allow implementing a multidimensional send-on-delta coding scheme. We validate our approach on a speech classification benchmark: the Google speech command dataset. We managed to reach nearly state-of-the-art accuracy (94%) while maintaining low firing rates (about 5 Hz). Our code is based on PyTorch and is available in open source at
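The discrete-time recurrence for the leaky integrate-and-fire potential, together with a surrogate gradient for the non-differentiable threshold, can be sketched in PyTorch as follows. This is a minimal illustration, not the report's code: the fast-sigmoid surrogate, the soft reset by subtraction, and the function names are assumptions chosen for clarity.

```python
import torch

class SpikeFunction(torch.autograd.Function):
    """Heaviside spike in the forward pass, surrogate gradient in the
    backward pass (here a fast-sigmoid derivative, one common choice;
    the report's exact surrogate may differ)."""

    @staticmethod
    def forward(ctx, v_minus_threshold):
        ctx.save_for_backward(v_minus_threshold)
        return (v_minus_threshold > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Surrogate derivative: 1 / (1 + |x|)^2
        return grad_output / (1.0 + x.abs()) ** 2

def lif_forward(inputs, w, beta=0.9, threshold=1.0):
    """Simulate one layer of LIF neurons over T time steps.

    inputs: (T, batch, n_in) spike trains; w: (n_in, n_out) weights.
    The membrane potential follows the recurrence v[t] = beta * v[t-1] + I[t],
    i.e. the discrete-time approximation of the LIF differential equation,
    with a reset by threshold subtraction after each spike.
    """
    T, batch, _ = inputs.shape
    n_out = w.shape[1]
    v = torch.zeros(batch, n_out)
    spikes = []
    for t in range(T):
        v = beta * v + inputs[t] @ w             # leak + integrate
        s = SpikeFunction.apply(v - threshold)   # fire
        v = v - s * threshold                    # soft reset
        spikes.append(s)
    return torch.stack(spikes)
```

Because the surrogate makes the spike differentiable in the backward pass, the whole unrolled loop can be trained with standard backpropagation through time.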
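The Denève-style horizontal connections described above can also be sketched compactly: when neuron j fires, every neuron i sharing its receptive field has the dot product of their weight vectors subtracted from its potential. The snippet below is a hedged sketch of that rule (dense layer case, hypothetical function name), not the report's implementation.

```python
import torch

def apply_lateral_inhibition(v, spikes, w):
    """Fast horizontal connections à la Denève (illustrative sketch).

    v: (batch, n_out) membrane potentials; spikes: (batch, n_out) 0/1
    spike indicators; w: (n_in, n_out) feed-forward weights, one column
    per neuron. For each firing neuron j we subtract <w_i, w_j> from the
    potential of every neuron i; the diagonal term <w_j, w_j> acts as the
    firing neuron's own reset.
    """
    gram = w.t() @ w          # (n_out, n_out) pairwise dot products
    return v - spikes @ gram  # subtract sum_j s_j * <w_i, w_j> from each v_i
```

Precomputing the Gram matrix of the weight columns makes the lateral update a single matrix product per time step, which is what allows the population to implement a multidimensional send-on-delta code efficiently.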


