Deep Network Regularization via Bayesian Inference of Synaptic Connectivity

03/04/2018
by Harris Partaourides, et al.

Deep neural networks (DNNs) often require good regularizers to generalize well. Current state-of-the-art DNN regularization techniques randomly drop units and/or connections on each iteration of the training algorithm; Dropout and DropConnect are characteristic examples of such regularizers and are widely popular among practitioners. A drawback of these approaches, however, is that the postulated probability of randomly omitting a unit or connection is a constant that must be selected heuristically, based on performance on a validation set. To alleviate this burden, in this paper we regard DNN regularization from a Bayesian inference perspective: we impose a sparsity-inducing prior over the network synaptic weights, where the sparsity is induced by a set of Bernoulli-distributed binary variables whose (prior) parameters are assigned Beta hyper-priors. This allows us to marginalize over the DNN synaptic connectivity when generating outputs, giving rise to an effective, heuristics-free network regularization scheme. We perform Bayesian inference for the resulting hierarchical model by means of an efficient black-box variational inference scheme. We exhibit the advantages of our method over existing approaches through an extensive experimental evaluation on benchmark datasets.
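To make the construction concrete, below is a minimal, hypothetical PyTorch sketch (not the authors' implementation) of a linear layer in this spirit: each synaptic weight gets a Gaussian variational posterior, each connection gets a Bernoulli on/off indicator, and a layer-wide Beta posterior governs the inclusion probability. The class name BayesianDropConnectLinear, the Beta(1, 1) default prior, and the use of a relaxed-Bernoulli (concrete) reparameterization in place of the paper's black-box variational inference estimator are assumptions made purely for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.distributions import Normal, Bernoulli, Beta, RelaxedBernoulli, kl_divergence


class BayesianDropConnectLinear(nn.Module):
    """Illustrative linear layer: per-connection Bernoulli masks, Beta prior on inclusion probability."""

    def __init__(self, in_features, out_features, prior_beta=(1.0, 1.0), temperature=0.1):
        super().__init__()
        # Gaussian variational posterior over the synaptic weights.
        self.w_mu = nn.Parameter(0.01 * torch.randn(out_features, in_features))
        self.w_logvar = nn.Parameter(-6.0 * torch.ones(out_features, in_features))
        # Per-connection logits of the Bernoulli inclusion posterior q(z).
        self.z_logit = nn.Parameter(2.0 * torch.ones(out_features, in_features))
        # Variational Beta posterior q(pi) over the layer-wide inclusion probability (log-parameterized).
        self.pi_log_a = nn.Parameter(torch.tensor(0.0))
        self.pi_log_b = nn.Parameter(torch.tensor(0.0))
        self.register_buffer("prior_a", torch.tensor(prior_beta[0]))
        self.register_buffer("prior_b", torch.tensor(prior_beta[1]))
        self.register_buffer("temp", torch.tensor(temperature))

    def forward(self, x):
        # Reparameterized draw of the weights.
        w = Normal(self.w_mu, (0.5 * self.w_logvar).exp()).rsample()
        # Concrete (relaxed Bernoulli) draw of the connectivity mask; a differentiable
        # stand-in for the discrete draws the paper handles with black-box VI.
        z = RelaxedBernoulli(self.temp, logits=self.z_logit).rsample()
        return F.linear(x, w * z)

    def kl(self):
        # KL(q(W) || N(0, I)) for the weights.
        q_w = Normal(self.w_mu, (0.5 * self.w_logvar).exp())
        p_w = Normal(torch.zeros_like(self.w_mu), torch.ones_like(self.w_mu))
        kl_w = kl_divergence(q_w, p_w).sum()

        # KL(q(pi) || Beta(a, b)) for the inclusion probability.
        q_pi = Beta(self.pi_log_a.exp(), self.pi_log_b.exp())
        p_pi = Beta(self.prior_a, self.prior_b)
        kl_pi = kl_divergence(q_pi, p_pi)

        # E_{q(pi)}[ KL(q(Z) || Bernoulli(pi)) ], approximated with one reparameterized draw of pi.
        pi = q_pi.rsample()
        q_z = Bernoulli(logits=self.z_logit)
        p_z = Bernoulli(probs=pi.expand_as(self.z_logit))
        kl_z = kl_divergence(q_z, p_z).sum()

        return kl_w + kl_z + kl_pi

A hypothetical training step would then minimize the negative evidence lower bound, i.e. the usual negative log-likelihood plus the layer's KL term scaled by the (assumed) dataset size:

layer = BayesianDropConnectLinear(784, 10)
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
logits = layer(x)
loss = F.cross_entropy(logits, y) + layer.kl() / 60000  # 60k is an assumed dataset size
loss.backward()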


Related research

06/12/2019 · MOPED: Efficient priors for scalable variational inference in Bayesian deep neural networks
Variational inference for Bayesian deep neural networks (DNNs) requires ...

05/16/2021 · Bayesian reconstruction of memories stored in neural networks from their connectivity
The advent of comprehensive synaptic wiring diagrams of large neural cir...

05/19/2018 · Nonparametric Bayesian Deep Networks with Local Competition
Local competition among neighboring neurons is a common procedure taking...

11/18/2021 · Locally Learned Synaptic Dropout for Complete Bayesian Inference
The Bayesian brain hypothesis postulates that the brain accurately opera...

04/20/2015 · Network Plasticity as Bayesian Inference
General results from statistical learning theory suggest to understand n...

10/10/2019 · Rate Optimal Variational Bayesian Inference for Sparse DNN
Sparse deep neural network (DNN) has drawn much attention in recent stud...

11/17/2014 · A Nonparametric Bayesian Approach Toward Stacked Convolutional Independent Component Analysis
Unsupervised feature learning algorithms based on convolutional formulat...
