Spiking Neural Networks with Improved Inherent Recurrence Dynamics for Sequential Learning

by Wachirawit Ponghiran, et al.

Spiking neural networks (SNNs) with leaky integrate-and-fire (LIF) neurons can be operated in an event-driven manner and have internal states to retain information over time, providing opportunities for energy-efficient neuromorphic computing, especially on edge devices. However, many representative works on SNNs do not fully demonstrate the usefulness of their inherent recurrence (membrane potentials retaining information about the past) for sequential learning. Most such works train SNNs to recognize static images from input representations artificially expanded in time through rate coding. We show that SNNs can be trained for sequential tasks and propose modifications to a network of LIF neurons that enable the internal states to learn long sequences and make their inherent recurrence resilient to the vanishing gradient problem. We then develop a training scheme for the proposed SNNs with improved inherent recurrence dynamics. Our training scheme allows spiking neurons to produce multi-bit outputs (as opposed to binary spikes), which helps mitigate the mismatch between the derivative of the spiking neurons' activation function and the surrogate derivative used to overcome their non-differentiability. Our experimental results indicate that the proposed SNN architecture yields accuracy comparable to that of LSTMs on the TIMIT and LibriSpeech 100h datasets (within 1.10%) while requiring fewer parameters than LSTMs. The sparse SNN outputs also lead to 10.13x and 11.14x savings in multiplication operations compared to GRUs, which are generally considered a lightweight alternative to LSTMs, on the TIMIT and LibriSpeech 100h datasets, respectively.
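To make the "inherent recurrence" of LIF neurons concrete, the following is a minimal sketch of discrete-time LIF dynamics: the membrane potential `v` leaks and accumulates input across time steps, acting as an internal state. The decay factor `beta` and `threshold` values here are illustrative defaults, not parameters from the paper, and the soft-reset variant shown is one common convention.

```python
import numpy as np

def lif_step(x, v, beta=0.9, threshold=1.0):
    """One discrete-time step of a leaky integrate-and-fire (LIF) neuron.

    The membrane potential v retains leaky information about past inputs,
    which is the inherent recurrence the abstract refers to.
    """
    v = beta * v + x                              # leaky integration of input
    spike = (v >= threshold).astype(np.float32)   # binary spike output
    v = v - spike * threshold                     # soft reset after a spike
    return spike, v

# Drive a single neuron with a constant sub-threshold input; charge builds
# up over several steps before each spike.
v = np.zeros(1, dtype=np.float32)
outputs = []
for t in range(10):
    s, v = lif_step(np.array([0.4], dtype=np.float32), v)
    outputs.append(int(s[0]))

print(outputs)  # → [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

Because the spike function is a non-differentiable step, training such a network with backpropagation requires a surrogate derivative in the backward pass; the paper's multi-bit outputs are motivated by reducing the mismatch between that surrogate and the true activation derivative.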




