Adaptive Synaptic Failure Enables Sampling from Posterior Predictive Distributions in the Brain

by Kevin McKee et al.

Bayesian interpretations of neural processing require that biological mechanisms represent and operate upon probability distributions in accordance with Bayes' theorem. Many have speculated that synaptic failure constitutes a mechanism of variational, i.e., approximate, Bayesian inference in the brain. Whereas models have previously used synaptic failure to sample over uncertainty in model parameters, we demonstrate that by adapting transmission probabilities to learned network weights, synaptic failure can sample not only over model uncertainty, but complete posterior predictive distributions as well. Our results potentially explain the brain's ability to perform probabilistic searches and to approximate complex integrals. These operations are involved in numerous calculations, including likelihood evaluation and state value estimation for complex planning.
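The core idea — adapting synaptic transmission probabilities to learned weights so that stochastic forward passes yield samples from a predictive distribution — can be illustrated with a minimal Monte Carlo sketch. The probability rule below (transmission probability increasing with weight magnitude) is an illustrative assumption for a single linear layer, not the paper's exact adaptation scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical learned weights of a single linear layer (4 inputs, 3 outputs).
W = rng.normal(0.0, 1.0, size=(4, 3))

# Illustrative assumption: transmission probability adapted to weight
# magnitude, so strong synapses transmit more reliably than weak ones.
p = 1.0 / (1.0 + np.exp(-np.abs(W)))

def stochastic_forward(x, n_samples=1000):
    """Each pass drops synapses independently with probability 1 - p;
    the spread of the resulting samples reflects model uncertainty."""
    samples = []
    for _ in range(n_samples):
        mask = rng.random(W.shape) < p     # Bernoulli synaptic transmission
        samples.append((W * mask).T @ x)   # failed synapses contribute zero
    return np.array(samples)

x = np.ones(4)
samples = stochastic_forward(x)
print(samples.mean(axis=0))  # Monte Carlo estimate of the expected output
print(samples.std(axis=0))   # output uncertainty induced by synaptic failure
```

Averaging many such stochastic passes approximates the integral over dropout masks, which is the sense in which random synaptic failure can implement sampling-based approximate inference.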
