Regularization of Deep Neural Networks with Spectral Dropout

11/23/2017
by Salman Khan, et al.

The big breakthrough on the ImageNet challenge in 2012 was partially due to the `dropout' technique used to avoid overfitting. Here, we introduce a new approach called `Spectral Dropout' to improve the generalization ability of deep neural networks. We cast the proposed approach in the form of regular Convolutional Neural Network (CNN) weight layers using a decorrelation transform with fixed basis functions. Our spectral dropout method prevents overfitting by eliminating weak and `noisy' Fourier-domain coefficients of the neural network activations, leading to remarkably better results than current regularization methods. Furthermore, the proposed approach is very efficient due to the fixed basis functions used for the spectral transformation. In particular, compared to Dropout and Drop-Connect, our method significantly speeds up the network convergence rate during training (roughly 2x), with considerably higher neuron pruning rates (an increase of ~30%). We show that spectral dropout can also be used in conjunction with other regularization approaches, resulting in additional performance gains.
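The core idea, as described in the abstract, is to transform activations with a fixed-basis decorrelation transform, suppress the weak spectral coefficients, and transform back. A minimal NumPy sketch of that idea is below; the function name `spectral_dropout`, the `keep_frac` parameter, and the simple per-sample magnitude threshold are illustrative assumptions, not the paper's exact coefficient-selection rule.

```python
import numpy as np

def spectral_dropout(x, keep_frac=0.7):
    """Illustrative sketch of spectral dropout on a batch of activations.

    x: array of shape (batch, features). Activations are moved to the
    Fourier domain, the weakest coefficients (by magnitude) are zeroed,
    and the result is transformed back, suppressing 'noisy' components.
    keep_frac: assumed fraction of coefficients to retain (hypothetical
    knob; the paper's selection criterion may differ).
    """
    X = np.fft.fft(x, axis=-1)            # fixed-basis spectral transform
    mag = np.abs(X)
    k = int(x.shape[-1] * (1 - keep_frac))  # number of coefficients to drop
    if k > 0:
        # per-sample threshold: the k-th smallest magnitude in each row
        thresh = np.partition(mag, k - 1, axis=-1)[:, k - 1:k]
        X = np.where(mag > thresh, X, 0.0)  # zero the weak coefficients
    # conjugate-symmetric pairs share a magnitude, so symmetry survives
    # the thresholding and the inverse transform is (numerically) real
    return np.real(np.fft.ifft(X, axis=-1))
```

Because the transform uses fixed basis functions (here the DFT), no extra parameters are learned, which is consistent with the efficiency claim in the abstract; in a CNN the same operation can be expressed as fixed weight layers around the thresholding step.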


research 02/07/2019
Ising-Dropout: A Regularization Method for Training and Compression of Deep Neural Networks
Overfitting is a major problem in training machine learning models, spec...

research 04/25/2019
Survey of Dropout Methods for Deep Neural Networks
Dropout methods are a family of stochastic techniques used in neural net...

research 12/20/2014
Neural Network Regularization via Robust Weight Factorization
Regularization is essential when training large neural networks. As deep...

research 12/15/2016
Improving Neural Network Generalization by Combining Parallel Circuits with Dropout
In an attempt to solve the lengthy training times of neural networks, we...

research 12/04/2017
Data Dropout in Arbitrary Basis for Deep Network Regularization
An important problem in training deep networks with high capacity is to ...

research 04/06/2023
Spectral Gap Regularization of Neural Networks
We introduce Fiedler regularization, a novel approach for regularizing n...

research 03/01/2021
LocalDrop: A Hybrid Regularization for Deep Neural Networks
In neural networks, developing regularization algorithms to settle overf...
