tinySNN: Towards Memory- and Energy-Efficient Spiking Neural Networks

Larger Spiking Neural Network (SNN) models are typically favored because they offer higher accuracy. However, deploying such models on resource- and energy-constrained embedded platforms is inefficient. To address this, we present tinySNN, a framework that optimizes the memory and energy requirements of SNN processing in both the training and inference phases while keeping accuracy high. This is achieved by reducing the number of SNN operations, improving the learning quality, quantizing the SNN parameters, and selecting an appropriate SNN model. Specifically, tinySNN quantizes different SNN parameters (i.e., weights and neuron parameters) to maximize compression, exploring different combinations of quantization schemes, precision levels, and rounding schemes to find a model that provides acceptable accuracy. The experimental results demonstrate that tinySNN significantly reduces the memory footprint and energy consumption of SNNs compared to the baseline network, without accuracy loss. Therefore, tinySNN effectively compresses a given SNN model to achieve high accuracy in a memory- and energy-efficient manner, enabling the use of SNNs in resource- and energy-constrained embedded applications.
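The parameter-quantization step described in the abstract can be sketched as follows. This is an illustrative fixed-point (Qm.n) quantizer with selectable rounding schemes, not the paper's actual implementation; the function name, Q-format convention, and rounding options are assumptions made for the sake of the example:

```python
import numpy as np

def quantize_fixed_point(values, n_int, n_frac, rounding="nearest"):
    """Quantize floating-point values to a signed fixed-point Q(n_int).(n_frac) format.

    n_int / n_frac select the precision level (integer and fractional bits);
    rounding selects the rounding scheme: "nearest" (round to nearest) or
    "truncate" (round toward zero). Returns the de-quantized float values,
    so the quantization error can be inspected directly.
    """
    scale = 2.0 ** n_frac
    scaled = np.asarray(values, dtype=np.float64) * scale
    if rounding == "nearest":
        q = np.round(scaled)
    elif rounding == "truncate":
        q = np.trunc(scaled)
    else:
        raise ValueError(f"unknown rounding scheme: {rounding}")
    # Saturate to the representable range of the signed format.
    lo = -(2.0 ** (n_int + n_frac))
    hi = 2.0 ** (n_int + n_frac) - 1
    q = np.clip(q, lo, hi)
    return q / scale
```

A search over (`n_int`, `n_frac`, `rounding`) combinations, as the abstract describes, would then evaluate the quantized model's accuracy for each setting and keep the most compressed one that stays within an acceptable accuracy budget.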


Q-SpiNN: A Framework for Quantizing Spiking Neural Networks

A prominent technique for reducing the memory footprint of Spiking Neura...

FSpiNN: An Optimization Framework for Memory- and Energy-Efficient Spiking Neural Networks

Spiking Neural Networks (SNNs) are gaining interest due to their event-d...

Hardware/Software co-design with ADC-Less In-memory Computing Hardware for Spiking Neural Networks

Spiking Neural Networks (SNNs) are bio-plausible models that hold great ...

A Fully Spiking Hybrid Neural Network for Energy-Efficient Object Detection

This paper proposes a Fully Spiking Hybrid Neural Network (FSHNN) for en...

How to train accurate BNNs for embedded systems?

A key enabler of deploying convolutional neural networks on resource-con...

TopSpark: A Timestep Optimization Methodology for Energy-Efficient Spiking Neural Networks on Autonomous Mobile Agents

Autonomous mobile agents require low-power/energy-efficient machine lear...

Energy-Efficient Respiratory Anomaly Detection in Premature Newborn Infants

Precise monitoring of respiratory rate in premature infants is essential...
