Minimal Effort Back Propagation for Convolutional Neural Networks

09/18/2017
by Bingzhen Wei et al.

As traditional neural networks consume a significant amount of computing resources during back propagation, Sun et al. (2017) proposed meProp, a simple yet effective technique to alleviate this problem: only a small subset of the full gradient is computed to update the model parameters. In this paper we extend this technique to Convolutional Neural Networks (CNNs) to reduce the computation in back propagation, and the surprising results verify its validity for CNNs: only 5% of the gradients are passed back, yet the model achieves the same accuracy as a traditional CNN, or even better. We also show that the top-k selection of gradients leads to sparse computation in back propagation, which may bring significant computational benefits given the high computational cost of the convolution operation in CNNs.
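
As a rough illustration of the top-k gradient selection described above, here is a minimal PyTorch sketch. This is not the authors' released implementation; the class name TopKGrad, the per-example flattening of the gradient, and the choice of k are assumptions made for the example. The layer is the identity in the forward pass and, in the backward pass, keeps only the k largest-magnitude gradient entries per example, so a sparse gradient propagates to the preceding convolution.

```python
import torch

class TopKGrad(torch.autograd.Function):
    """Identity in the forward pass; in the backward pass only the k
    largest-magnitude gradient entries per example are kept and the
    rest are zeroed (a sketch of meProp-style top-k selection)."""

    @staticmethod
    def forward(ctx, x, k):
        ctx.k = k
        return x.view_as(x)  # identity; view avoids aliasing the input

    @staticmethod
    def backward(ctx, grad_output):
        # Flatten all non-batch dimensions, select top-k by magnitude.
        flat = grad_output.reshape(grad_output.size(0), -1)
        _, idx = flat.abs().topk(ctx.k, dim=1)
        mask = torch.zeros_like(flat)
        mask.scatter_(1, idx, 1.0)            # 1.0 at the kept positions
        sparse = (flat * mask).reshape_as(grad_output)
        return sparse, None                   # no gradient w.r.t. k

# Hypothetical usage: sparsify the gradient flowing back into a conv layer.
conv = torch.nn.Conv2d(3, 8, kernel_size=3)
x = torch.randn(4, 3, 32, 32)
out = TopKGrad.apply(conv(x), 16)   # pass back only 16 entries per example
loss = (out ** 2).mean()
loss.backward()                     # conv's weights see the sparsified gradient
```

Note that this sketch only zeroes the dense gradient tensor; the computational savings the abstract points to would come from exploiting that sparsity inside the convolution's backward kernels rather than from the masking itself.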
