Layer Ensembles

10/10/2022
by Illia Oleksiienko, et al.

Deep Ensembles, a form of Bayesian Neural Network, estimate uncertainty by collecting the predictions of multiple independently trained networks and measuring the disagreement between them. In this paper, we introduce a novel method for uncertainty estimation called Layer Ensembles, which places an independent categorical distribution over each layer of the network; because samples can share layers, this yields far more possible ensemble members than regular Deep Ensembles. We further introduce Optimized Layer Ensembles, an inference procedure that reuses common layer outputs, achieving up to a 19x speed-up and reducing memory usage quadratically. We also show that Layer Ensembles can be further improved by ranking samples, resulting in models that require less memory and time to run while achieving higher uncertainty quality than Deep Ensembles.
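As a rough illustration of the idea (not the authors' implementation), the sketch below assumes a small PyTorch MLP in which every layer is an ensemble of independent instances; an ensemble sample picks one instance per layer, so different samples can overlap in the layers they use, and the variance of predictions across samples serves as the uncertainty estimate. All class names, layer sizes, and the number of members per layer are illustrative assumptions.

```python
# Hypothetical sketch of per-layer ensembling; not the paper's reference code.
import torch
import torch.nn as nn


class LayerEnsemble(nn.Module):
    """One ensemble layer: n_members independent copies of the same layer."""

    def __init__(self, make_layer, n_members: int):
        super().__init__()
        self.members = nn.ModuleList(make_layer() for _ in range(n_members))

    def forward(self, x, member: int):
        # Route the input through the sampled member for this layer.
        return self.members[member](x)


class LayerEnsembleNet(nn.Module):
    """Stack of LayerEnsemble layers; a sample is one member index per layer."""

    def __init__(self, in_dim=16, hidden=32, out_dim=10, n_members=3):
        super().__init__()
        self.n_members = n_members
        self.layers = nn.ModuleList([
            LayerEnsemble(lambda: nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU()), n_members),
            LayerEnsemble(lambda: nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU()), n_members),
            LayerEnsemble(lambda: nn.Linear(hidden, out_dim), n_members),
        ])

    def sample_indices(self):
        # Independent (here uniform) categorical choice for every layer.
        return [torch.randint(self.n_members, (1,)).item() for _ in self.layers]

    def forward(self, x, indices):
        for layer, member in zip(self.layers, indices):
            x = layer(x, member)
        return x


if __name__ == "__main__":
    net = LayerEnsembleNet()
    x = torch.randn(4, 16)
    # Draw several ensemble samples and use their disagreement as uncertainty.
    preds = torch.stack([net(x, net.sample_indices()).softmax(-1) for _ in range(5)])
    uncertainty = preds.var(dim=0).sum(dim=-1)
    print(uncertainty)
```

With 3 members per layer and 3 layers, this toy network admits 27 distinct samples from only 9 trained layer instances, which is the combinatorial advantage the abstract refers to; the reuse of shared layer outputs across samples is what the Optimized Layer Ensembles inference procedure exploits.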


Related research

10/17/2019  Deep Sub-Ensembles for Fast Uncertainty Estimation in Image Classification
Fast estimates of model uncertainty are required for many robust robotic...

02/22/2022  Confident Neural Network Regression with Bootstrapped Deep Ensembles
With the rise of the popularity and usage of neural networks, trustworth...

10/17/2022  Packed-Ensembles for Efficient Uncertainty Estimation
Deep Ensembles (DE) are a prominent approach achieving excellent perform...

08/31/2022  The Infinitesimal Jackknife and Combinations of Models
The Infinitesimal Jackknife is a general method for estimating variances...

10/18/2022  Disentangling the Predictive Variance of Deep Ensembles through the Neural Tangent Kernel
Identifying unfamiliar inputs, also known as out-of-distribution (OOD) d...

07/07/2023  DE-TGN: Uncertainty-Aware Human Motion Forecasting using Deep Ensembles
Ensuring the safety of human workers in a collaborative environment with...

06/24/2022  Out of distribution robustness with pre-trained Bayesian neural networks
We develop ShiftMatch, a new training-data-dependent likelihood for out ...
