Nonparametric Bayesian Deep Networks with Local Competition

by Konstantinos P. Panousis et al.

Local competition among neighboring neurons is a common process in biological systems. This finding has inspired research on more biologically plausible deep networks comprising competing linear units; such models can be effectively trained by means of gradient-based backpropagation. This is in contrast to traditional deep networks, built of nonlinear units that do not entail any form of (local) competition. However, for competition-based networks, the problem of data-driven inference of their most appropriate configuration, including the required number of connections or locally competing sets of units, has not been addressed by the research community. This work constitutes the first attempt to remedy this shortcoming; to this end, we leverage solid arguments from the field of Bayesian nonparametrics. Specifically, we introduce auxiliary discrete latent variables representing model component utility, and perform Bayesian inference over them. We impose appropriate stick-breaking priors over the introduced discrete latent variables; these give rise to a well-established sparsity-inducing mechanism. We devise efficient inference algorithms for our model by resorting to stochastic gradient variational Bayes. We perform an extensive experimental evaluation of our approach using benchmark data. Our results verify that we obtain state-of-the-art accuracy, albeit via networks of much smaller memory and computational footprint than the competition.
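To make the two core ingredients concrete, the following is a minimal NumPy sketch, not the authors' implementation: a local winner-take-all (LWTA) layer in which linear units compete in small blocks, and a stick-breaking construction yielding utility probabilities from which binary indicators are sampled to prune unused blocks. All names (`lwta_block`, `stick_breaking_weights`) and the block size `U` are illustrative assumptions; the paper's actual inference uses stochastic gradient variational Bayes rather than the forward sampling shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

def stick_breaking_weights(alpha, K, rng):
    """Stick-breaking construction: v_k ~ Beta(1, alpha) and
    pi_k = v_k * prod_{j<k} (1 - v_j). Earlier "sticks" tend to be
    larger, so only a few components receive appreciable utility."""
    v = rng.beta(1.0, alpha, size=K)
    pi = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    return pi

def lwta_block(x, W, U=2):
    """Local winner-take-all layer: the K*U linear units are grouped
    into K blocks of U competitors; within each block only the unit
    with the largest activation fires (passes its linear output),
    while the rest emit zero."""
    h = x @ W                        # linear pre-activations, shape (K*U,)
    h = h.reshape(-1, U)             # one row per competing block
    winners = h.argmax(axis=1)       # winning unit index per block
    out = np.zeros_like(h)
    rows = np.arange(h.shape[0])
    out[rows, winners] = h[rows, winners]
    return out.ravel()

# Sketch of component-utility pruning: sample binary indicators z_k
# from the stick-breaking probabilities and mask out unused blocks.
D, K, U = 8, 4, 2                    # illustrative sizes
W = rng.standard_normal((D, K * U))
pi = stick_breaking_weights(alpha=1.0, K=K, rng=rng)
z = rng.random(K) < pi               # Bernoulli utility indicator per block
mask = np.repeat(z, U)               # expand indicators to individual units
x = rng.standard_normal(D)
y = lwta_block(x, W * mask, U=U)     # pruned blocks contribute nothing
```

In the paper's setting, the Bernoulli indicators and the winner selection are treated as latent variables with variational posteriors (trained end-to-end), which is what allows the network to infer how many competing blocks it actually needs.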




Local Competition and Stochasticity for Adversarial Robustness in Deep Learning

This work addresses adversarial robustness in deep learning by consideri...

Competing Mutual Information Constraints with Stochastic Competition-based Activations for Learning Diversified Representations

This work aims to address the long-established problem of learning diver...

Recurrent Latent Variable Networks for Session-Based Recommendation

In this work, we attempt to ameliorate the impact of data sparsity in th...

Deep Network Regularization via Bayesian Inference of Synaptic Connectivity

Deep neural networks (DNNs) often require good regularizers to generaliz...

Local Competition and Uncertainty for Adversarial Robustness in Deep Learning

This work attempts to address adversarial robustness of deep networks by...

Linear-Nonlinear-Poisson Neuron Networks Perform Bayesian Inference On Boltzmann Machines

One conjecture in both deep learning and classical connectionist viewpoi...

Stochastic Deep Networks with Linear Competing Units for Model-Agnostic Meta-Learning

This work addresses meta-learning (ML) by considering deep networks with...
