Locally Learned Synaptic Dropout for Complete Bayesian Inference

11/18/2021
by Kevin L. McKee, et al.

The Bayesian brain hypothesis postulates that the brain accurately operates on statistical distributions according to Bayes' theorem. The random failure of presynaptic vesicles to release neurotransmitters may allow the brain to sample from posterior distributions of network parameters, interpreted as epistemic uncertainty. It has not previously been shown how random failures might allow networks to sample from observed distributions, also known as aleatoric or residual uncertainty. Sampling from both distributions enables probabilistic inference, efficient search, and creative or generative problem solving. We demonstrate that, under a population-code-based interpretation of neural activity, both types of distribution can be represented and sampled with synaptic failure alone. We first define a biologically constrained neural network and sampling scheme based on synaptic failure and lateral inhibition; within this framework, we derive dropout-based epistemic uncertainty and then prove an analytic mapping from synaptic efficacy to release probability that allows networks to sample from arbitrary, learned distributions represented by a receiving layer. Second, this result leads to a local learning rule by which synapses adapt their release probabilities. Our results demonstrate complete Bayesian inference, related to the variational learning method of dropout, in a biologically constrained network using only locally learned synaptic failure rates.
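The sampling idea above can be illustrated with a minimal sketch, assuming a toy feedforward network in which each synapse transmits with an independent release probability, so repeated stochastic forward passes yield samples of a predictive distribution (the Monte Carlo dropout analogy the abstract refers to). All names, sizes, and the fixed release probability below are hypothetical illustrations, not the authors' model, which additionally uses population codes, lateral inhibition, and a learned mapping from efficacy to release probability.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy network: one hidden layer of ReLU units.
W1 = rng.normal(size=(16, 4))   # input -> hidden synaptic efficacies
W2 = rng.normal(size=(1, 16))   # hidden -> output synaptic efficacies
p_release = 0.7                 # assumed per-synapse release probability

def stochastic_forward(x):
    # Each synapse independently fails with probability 1 - p_release,
    # analogous to presynaptic vesicle release failure (cf. MC dropout).
    m1 = rng.random(W1.shape) < p_release
    m2 = rng.random(W2.shape) < p_release
    h = np.maximum(0.0, (W1 * m1) @ x / p_release)  # rescale to preserve expectation
    return (W2 * m2) @ h / p_release

x = rng.normal(size=4)
samples = np.array([stochastic_forward(x) for _ in range(200)])
print("predictive mean:", samples.mean(), "spread (std):", samples.std())

Averaging many such stochastic passes gives a predictive mean, while the spread across passes reflects the uncertainty induced by synaptic failure; the paper's contribution is to show how release probabilities can be set, and locally learned, so that this spread captures both epistemic and aleatoric components.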
