A binary-activation, multi-level weight RNN and training algorithm for processing-in-memory inference with eNVM

11/30/2019
by Siming Ma, et al.

We present a new algorithm for training neural networks with binary activations and multi-level weights, which enables efficient processing-in-memory (PIM) circuits built with embedded non-volatile memory (eNVM). Binary activations obviate costly DACs and ADCs, and multi-level weights leverage multi-level eNVM cells. Compared with previous quantization algorithms, our method not only works for feed-forward networks, including fully-connected and convolutional ones, but also achieves higher accuracy and noise resilience for recurrent networks. In particular, we present an RNN trigger-word detection PIM accelerator whose modeling results demonstrate high performance using our new training algorithm.
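
The abstract does not spell out the training algorithm itself, but the ingredients it names, binary activations and multi-level weights, are commonly trained with straight-through estimators (STEs) over full-precision shadow weights. The PyTorch sketch below illustrates that generic pattern only; the names BinaryActivation, quantize_weights, and num_levels are illustrative, not from the paper, and the uniform 4-level quantizer is an assumption standing in for eNVM cell conductance levels.

```python
import torch

class BinaryActivation(torch.autograd.Function):
    """Sign activation; backward uses a straight-through estimator (STE)."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        # Map pre-activations to +/-1 (a 1-bit output needs no ADC).
        return torch.where(x >= 0, torch.ones_like(x), -torch.ones_like(x))

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        # STE: pass the gradient through where |x| <= 1, zero it elsewhere.
        return grad_out * (x.abs() <= 1).float()


def quantize_weights(w, num_levels=4):
    """Uniformly quantize weights to num_levels values in [-1, 1],
    a stand-in for mapping onto multi-level eNVM cell conductances."""
    step = 2.0 / (num_levels - 1)
    w_q = torch.round((w.clamp(-1, 1) + 1) / step) * step - 1
    # STE for weights: the forward pass uses w_q, while the backward
    # pass flows to the full-precision shadow weights w.
    return w + (w_q - w).detach()


# Toy layer: multi-level weights, binary activations.
x = torch.randn(8, 16)
w = torch.randn(16, 4, requires_grad=True)
y = BinaryActivation.apply(x @ quantize_weights(w))
y.sum().backward()  # gradients reach w through both STEs
print(y.unique())   # tensor([-1., 1.])
```

Keeping a full-precision shadow copy of the weights and quantizing only in the forward pass is what lets gradient descent proceed despite the zero-gradient sign and round operations.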



Related research

02/25/2020
Optimal Gradient Quantization Condition for Communication-Efficient Distributed Training
The communication of gradients is costly for training deep neural networ...

11/03/2017
ResBinNet: Residual Binary Neural Network
Recent efforts on training light-weight binary neural networks offer pro...

09/01/2015
A Telescopic Binary Learning Machine for Training Neural Networks
This paper proposes a new algorithm based on multi-scale stochastic loca...

03/16/2021
Training Dynamical Binary Neural Networks with Equilibrium Propagation
Equilibrium Propagation (EP) is an algorithm intrinsically adapted to th...

04/29/2012
Dissimilarity Clustering by Hierarchical Multi-Level Refinement
We introduce in this paper a new way of optimizing the natural extension...

06/21/2023
Synaptic metaplasticity with multi-level memristive devices
Deep learning has made remarkable progress in various tasks, surpassing ...

02/19/2022
Bit-wise Training of Neural Network Weights
We introduce an algorithm where the individual bits representing the wei...
