Parameter Compression of Recurrent Neural Networks and Degradation of Short-term Memory

12/02/2016
by Jonathan A. Cox, et al.

The significant computational cost of deploying neural networks in large-scale or resource-constrained environments, such as data centers and mobile devices, has spurred interest in model compression, which can reduce both arithmetic operations and storage memory. Several techniques have been proposed for reducing or compressing the parameters of feed-forward and convolutional neural networks, but less is understood about the effect of parameter compression on recurrent neural networks (RNNs). In particular, the extent to which the recurrent parameters can be compressed, and the impact on short-term memory performance, are not well understood. In this paper, we study the effect of complexity reduction, through singular value decomposition (SVD) rank reduction, on RNN and minimal gated recurrent unit (MGRU) networks for several tasks. We show that considerable rank reduction is possible when compressing recurrent weights, even without fine-tuning. Furthermore, we propose a perturbation model for the effect of general perturbations, such as compression, on the recurrent parameters of RNNs. The model is tested against a noiseless memorization experiment that elucidates short-term memory performance. In this way, we demonstrate that the effect of compressing recurrent parameters depends on the degree of temporal coherence present in the data and task. This work can guide on-the-fly RNN compression for novel environments or tasks, and provides insight for applying RNN compression in low-power devices, such as hearing aids.
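For readers unfamiliar with the compression technique the abstract refers to, the sketch below illustrates truncated-SVD rank reduction of a recurrent weight matrix in NumPy. It is a minimal illustration only: the matrix name W_hh, the chosen rank, and the factorization layout are assumptions for this example, not the paper's exact procedure.

```python
# Minimal sketch of SVD rank reduction applied to a recurrent weight matrix.
# The names (W_hh, rank r) and sizes are illustrative assumptions.
import numpy as np

def svd_compress(W, r):
    """Return factors A, B of a rank-r approximation W ~= A @ B via truncated SVD.
    Storing A (n x r) and B (r x n) instead of W (n x n) reduces parameters and
    matrix-vector multiply cost when r << n."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :r] * s[:r]   # left factor, shape (n, r)
    B = Vt[:r, :]          # right factor, shape (r, n)
    return A, B

# Example: compress a 256x256 recurrent weight matrix to rank 32.
rng = np.random.default_rng(0)
W_hh = rng.standard_normal((256, 256)) / np.sqrt(256)
A, B = svd_compress(W_hh, 32)
rel_err = np.linalg.norm(W_hh - A @ B) / np.linalg.norm(W_hh)
print(f"relative reconstruction error at rank 32: {rel_err:.3f}")
```

In a compressed RNN, the recurrent update would use the two small factors (first multiply by B, then by A) rather than the full matrix, which is where the savings in operations and storage come from.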

