Conversion of Artificial Recurrent Neural Networks to Spiking Neural Networks for Low-power Neuromorphic Hardware

by Peter U. Diehl et al.

In recent years, the field of neuromorphic low-power systems, which consume orders of magnitude less power than conventional architectures, has gained significant momentum. However, their wider use is still hindered by the lack of algorithms that can harness the strengths of such architectures. While neuromorphic adaptations of representation learning algorithms are now emerging, efficient processing of temporal sequences or variable-length inputs remains difficult. Recurrent neural networks (RNNs) are widely used in machine learning to solve a variety of sequence learning tasks. In this work we present a train-and-constrain methodology that enables the mapping of machine-learned (Elman) RNNs onto a substrate of spiking neurons, while remaining compatible with the capabilities of current and near-future neuromorphic systems. This "train-and-constrain" method consists of first training RNNs using backpropagation through time, then discretizing the weights, and finally converting them to spiking RNNs by matching the responses of artificial neurons with those of the spiking neurons. We demonstrate our approach on a natural language processing task (question classification), where we demonstrate the entire mapping process of the recurrent layer of the network on IBM's Neurosynaptic System "TrueNorth", a spike-based digital neuromorphic hardware architecture. TrueNorth imposes specific constraints on connectivity and on neural and synaptic parameters. To satisfy these constraints, it was necessary to discretize the synaptic weights and neural activities to 16 levels, and to limit fan-in to 64 inputs. We find that short synaptic delays are sufficient to implement the dynamical (temporal) aspect of the RNN in the question classification task. The hardware-constrained model achieved 74% question classification accuracy while using less than 0.025% of the cores of one TrueNorth chip, with an estimated power consumption of 17 µW.
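The weight-discretization step described above can be illustrated with a minimal NumPy sketch. It assumes symmetric signed quantization of a trained weight matrix onto 16 integer levels (a 4-bit signed range), which matches the "16 levels" constraint mentioned in the abstract; the function name and the exact quantization scheme are illustrative assumptions, not the authors' precise procedure.

```python
import numpy as np

def discretize_weights(w, levels=16):
    """Quantize a trained weight matrix to `levels` discrete values.

    Illustrative sketch of the discretization step in a
    train-and-constrain pipeline: weights are scaled so the largest
    magnitude maps to the top positive integer level, rounded, and
    clipped to a signed range of `levels` integers (e.g. -8..7 for 16).
    """
    w = np.asarray(w, dtype=float)
    max_abs = np.max(np.abs(w))
    if max_abs == 0:
        return w.copy()
    half = levels // 2
    # Integer levels in [-half, half - 1]; max_abs maps to half - 1.
    q = np.clip(np.round(w / max_abs * (half - 1)), -half, half - 1)
    # Map back to the original weight scale for comparison/analysis.
    return q * max_abs / (half - 1)
```

After this step, the discretized weights can be written directly into the hardware's limited-precision synapse table; the rounding error per weight is at most half a quantization step.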




