Deep Neural Networks with Short Circuits for Improved Gradient Learning

by Ming Yan, et al.

Deep neural networks have achieved great success in both computer vision and natural language processing tasks. However, most state-of-the-art methods rely heavily on external training or computation to improve performance. To alleviate this external reliance, we propose a gradient enhancement approach, implemented via short-circuit neural connections, to improve the gradient learning of deep neural networks. The proposed short circuit is a unidirectional connection that back-propagates the sensitivity from deep layers to shallow ones. Moreover, the short circuit is formulated as a gradient truncation of the layers it crosses, so it can be plugged into backbone deep neural networks without introducing additional trainable parameters. Extensive experiments demonstrate that deep neural networks equipped with our short circuit outperform the baselines by a large margin on both computer vision and natural language processing tasks.

