Stochastic Neural Networks with Infinite Width are Deterministic

01/30/2022
by Liu Ziyin, et al.

This work theoretically studies stochastic neural networks, a widely used class of neural networks. Specifically, we prove that as the width of an optimized stochastic neural network tends to infinity, its predictive variance on the training set decreases to zero. Two common examples to which our theory applies are neural networks with dropout and variational autoencoders. Our result helps explain how stochasticity affects the learning of neural networks and can thus guide the design of better architectures for practical problems.
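The central claim, that the predictive variance of an optimized stochastic network shrinks as width grows, can be probed numerically. Below is a minimal sketch, not taken from the paper: the toy regression task, widths, dropout rate, and training budget are all illustrative assumptions. It trains one-hidden-layer dropout networks of increasing width and estimates the variance of their predictions on the training inputs over repeated stochastic forward passes.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Tiny 1-D regression task; the specific data is arbitrary.
x = torch.linspace(-1, 1, 32).unsqueeze(1)
y = torch.sin(3 * x)

def train_dropout_net(width, steps=2000, lr=1e-2, p=0.5):
    # One-hidden-layer network whose stochasticity comes from dropout.
    net = nn.Sequential(
        nn.Linear(1, width),
        nn.ReLU(),
        nn.Dropout(p),
        nn.Linear(width, 1),
    )
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = ((net(x) - y) ** 2).mean()
        loss.backward()
        opt.step()
    return net

def predictive_variance(net, n_samples=200):
    # Keep dropout active at evaluation time so each forward pass
    # samples the stochastic predictor, then measure the variance
    # of the predictions on the training inputs.
    net.train()
    with torch.no_grad():
        preds = torch.stack([net(x) for _ in range(n_samples)])
    return preds.var(dim=0).mean().item()

for width in [64, 256, 1024, 4096]:
    net = train_dropout_net(width)
    print(f"width={width:5d}  mean predictive variance={predictive_variance(net):.5f}")
```

Under the paper's result, the printed variance should trend toward zero as the width increases; with a finite training budget and optimization noise, the trend in a sketch like this is only approximate.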


Related research

01/13/2022 · An Overview of Uncertainty Quantification Methods for Infinite Neural Networks
To better understand the theoretical behavior of large neural networks, ...

11/29/2021 · Dependence between Bayesian neural network units
The connection between Bayesian neural networks and Gaussian processes g...

02/10/2022 · Exact Solutions of a Deep Linear Network
This work finds the exact solutions to a deep linear network with weight...

05/02/2022 · Triangular Dropout: Variable Network Width without Retraining
One of the most fundamental design choices in neural networks is layer w...

06/30/2020 · Associative Memory in Iterated Overparameterized Sigmoid Autoencoders
Recent work showed that overparameterized autoencoders can be trained to...

11/11/2022 · Do Bayesian Neural Networks Need To Be Fully Stochastic?
We investigate the efficacy of treating all the parameters in a Bayesian...

12/05/2016 · Improving the Performance of Neural Networks in Regression Tasks Using Drawering
The method presented extends a given regression neural network to make i...
