Bayesian Nested Neural Networks for Uncertainty Calibration and Adaptive Compression

by Yufei Cui, et al.

Nested networks, or slimmable networks, are neural networks whose architectures can be adjusted instantly at test time, e.g., to meet computational constraints. Recent studies have focused on a "nested dropout" layer, which orders the nodes of a layer by importance during training, thus generating a nested set of sub-networks that are optimal for different resource configurations. However, the dropout rate is fixed as a hyper-parameter across layers for the whole training process. When nodes are removed, performance therefore decays along a human-specified trajectory rather than along a trajectory learned from data. A further drawback is that the generated sub-networks are deterministic networks without well-calibrated uncertainty. To address these two problems, we develop a Bayesian approach to nested neural networks. We propose a variational ordering unit that draws samples for nested dropout at low cost from a proposed Downhill distribution, which provides useful gradients to the parameters of nested dropout. Building on this unit, we design a Bayesian nested neural network that learns the ordering of the node distributions. In experiments, we show that the proposed approach outperforms the nested network in terms of accuracy, calibration, and out-of-domain detection in classification tasks. It also outperforms related approaches on uncertainty-critical computer vision tasks.
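To make the masking idea concrete, here is a minimal sketch (not the authors' implementation) of how a nested-dropout mask and a continuous "downhill"-shaped sample can be drawn. The function names, the geometric prior on the truncation index, and the Gumbel-softmax relaxation below are assumptions for illustration only: a hard nested-dropout mask keeps the first `b` units and zeroes the rest, and a soft downhill-shaped mask can be obtained as the reverse cumulative sum of a relaxed one-hot sample.

```python
import numpy as np

def _softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def nested_dropout_mask(width, p=0.1, rng=None):
    """Hard nested-dropout mask: sample a truncation index b from a
    geometric distribution (capped at the layer width), keep the first
    b units, zero the rest -> shape (1, ..., 1, 0, ..., 0)."""
    rng = rng or np.random.default_rng()
    b = min(rng.geometric(p), width)
    mask = np.zeros(width)
    mask[:b] = 1.0
    return mask

def soft_downhill_mask(logits, temperature=1.0, rng=None):
    """Soft downhill-shaped sample: draw a Gumbel-softmax (relaxed
    one-hot) vector over truncation positions, then take its reverse
    cumulative sum, giving a non-increasing mask in [0, 1] that admits
    gradients w.r.t. the logits."""
    rng = rng or np.random.default_rng()
    g = rng.gumbel(size=logits.shape)
    y = _softmax((logits + g) / temperature)
    return y[::-1].cumsum()[::-1]
```

Both samples are non-increasing by construction, which is the "downhill" shape: every kept unit implies that all units ordered before it are kept as well, yielding the nested sub-network property.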




