Quantal synaptic dilution enhances sparse encoding and dropout regularisation in deep networks

by Gardave S Bhumbra et al.

Dropout is a technique that stochastically silences unit activity during the training of deep networks to reduce overfitting. Here we introduce Quantal Synaptic Dilution (QSD), a biologically plausible model of dropout regularisation based on the quantal properties of neuronal synapses, which incorporates heterogeneities in response magnitudes and release probabilities for vesicular quanta. QSD outperforms standard dropout in ReLU multilayer perceptrons, with enhanced sparse encoding at test time when dropout masks are replaced with identity functions, and without shifts in the distributions of trainable weights or biases. For convolutional networks, the method also improves generalisation in computer vision tasks, both with and without additional forms of regularisation. QSD likewise outperforms standard dropout in recurrent networks for language modelling and sentiment analysis. An advantage of QSD over many variants of dropout is that it can be implemented in any conventional deep network where standard dropout is applicable.
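The contrast between standard dropout and the quantal scheme described above can be sketched as follows. Standard dropout applies a homogeneous Bernoulli keep probability and a uniform rescaling; a quantal-style mask instead draws a heterogeneous release probability and a heterogeneous response magnitude per unit. The beta and log-normal distributions below are illustrative assumptions for those heterogeneities, not the paper's exact formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def qsd_mask(n_units, mean_keep=0.5):
    """Quantal-style dropout mask with heterogeneous per-unit statistics.

    Illustrative sketch: release probabilities are beta-distributed with
    mean `mean_keep`, and quantal magnitudes are log-normal. Both
    distributional choices are assumptions made for this example.
    """
    # Per-unit release probabilities; Beta(2, 2*(1-m)/m) has mean m
    p_release = rng.beta(2.0, 2.0 * (1.0 - mean_keep) / mean_keep,
                         size=n_units)
    # Per-unit quantal magnitudes (heterogeneous response sizes)
    q = rng.lognormal(mean=0.0, sigma=0.5, size=n_units)
    # A unit transmits ("releases") with its own probability
    released = rng.random(n_units) < p_release
    # Normalise so the expected activation scale matches inverted dropout
    return np.where(released, q / (q.mean() * mean_keep), 0.0)

x = np.ones(8)                 # toy pre-activations
train_out = x * qsd_mask(8)    # training: stochastic quantal dilution
test_out = x                   # test time: mask replaced by the identity
```

At test time the mask is simply the identity, matching the abstract's description; the normalisation inside `qsd_mask` plays the role that dividing by the keep probability plays in standard inverted dropout.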


