Robust uncertainty estimates with out-of-distribution pseudo-inputs training

01/15/2022
by Pierre Segonne, et al.

Probabilistic models often use neural networks to control their predictive uncertainty. However, when making out-of-distribution (OOD) predictions, the often uncontrollable extrapolation behaviour of neural networks yields poor uncertainty estimates. Such models then don't know what they don't know, which directly limits their robustness with respect to unexpected inputs. To counter this, we propose to explicitly train the uncertainty predictor in regions where no data are given, so as to make it reliable. As one cannot train without data, we provide mechanisms for generating pseudo-inputs in informative low-density regions of the input space, and show how to leverage them in a practical Bayesian framework that casts a prior distribution over the model uncertainty. With a holistic evaluation, we demonstrate that this yields robust and interpretable predictions of uncertainty while retaining state-of-the-art performance on diverse tasks such as regression and generative modelling.
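
Below is a minimal, illustrative sketch of the idea described in the abstract, not the paper's actual method. It assumes a heteroscedastic Gaussian regression network in PyTorch, generates pseudo-inputs with a crude additive-noise heuristic (a stand-in for the paper's dedicated mechanisms for sampling informative low-density regions), and adds a KL penalty that pulls the predictive distribution on those pseudo-inputs toward a broad prior, so the model reports high uncertainty away from the data. All names (HeteroscedasticNet, make_pseudo_inputs, ood_weight) are hypothetical.

```python
# Illustrative sketch only: a heteroscedastic regression net trained with an
# extra penalty that pulls its predictions toward a broad prior on pseudo-inputs
# sampled far from the training data. The pseudo-input generator below is a
# simple additive-noise heuristic, not the mechanism proposed in the paper.
import torch
import torch.nn as nn
from torch.distributions import Normal, kl_divergence


class HeteroscedasticNet(nn.Module):
    """Predicts a mean and a standard deviation for every input."""

    def __init__(self, d_in, d_hidden=64):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(d_in, d_hidden), nn.Tanh(),
            nn.Linear(d_hidden, d_hidden), nn.Tanh(),
        )
        self.mean_head = nn.Linear(d_hidden, 1)
        self.std_head = nn.Linear(d_hidden, 1)

    def forward(self, x):
        h = self.body(x)
        mean = self.mean_head(h)
        std = nn.functional.softplus(self.std_head(h)) + 1e-3  # keep std positive
        return mean, std


def make_pseudo_inputs(x, scale=3.0):
    """Crude low-density sampler: push training points away with large noise."""
    return x + scale * torch.randn_like(x)


def loss_fn(model, x, y, prior_std=1.0, ood_weight=0.1):
    # In-distribution term: Gaussian negative log-likelihood of the targets.
    mean, std = model(x)
    nll = -Normal(mean, std).log_prob(y).mean()

    # OOD term: on pseudo-inputs, pull the predictive distribution toward a
    # broad zero-mean prior so the model reports high uncertainty there.
    x_ood = make_pseudo_inputs(x)
    mean_ood, std_ood = model(x_ood)
    prior = Normal(torch.zeros_like(mean_ood), prior_std)
    ood_kl = kl_divergence(Normal(mean_ood, std_ood), prior).mean()

    return nll + ood_weight * ood_kl
```

A training loop would simply minimize loss_fn over minibatches; the ood_weight and the prior's width control how strongly the predictor is pushed toward "I don't know" outside the data distribution.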

Related research

07/24/2018 · Reliable Uncertainty Estimates in Deep Neural Networks using Noise Contrastive Priors
Obtaining reliable uncertainty estimates of neural network predictions i...

06/21/2023 · Density Uncertainty Layers for Reliable Uncertainty Estimation
Assessing the predictive uncertainty of deep neural networks is crucial ...

05/29/2018 · Lightweight Probabilistic Deep Networks
Even though probabilistic treatments of neural networks have a long hist...

02/13/2023 · Probabilistic Circuits That Know What They Don't Know
Probabilistic circuits (PCs) are models that allow exact and tractable p...

07/28/2020 · Toward Reliable Models for Authenticating Multimedia Content: Detecting Resampling Artifacts With Bayesian Neural Networks
In multimedia forensics, learning-based methods provide state-of-the-art...

05/21/2022 · Transformer-based out-of-distribution detection for clinically safe segmentation
In a clinical setting it is essential that deployed image processing sys...

02/22/2021 · Improving Uncertainty Calibration via Prior Augmented Data
Neural networks have proven successful at learning from complex data dis...
