Uncertainty-Wizard: Fast and User-Friendly Neural Network Uncertainty Quantification

12/29/2020
by   Michael Weiss, et al.

Uncertainty and confidence have been shown to be useful metrics in a wide variety of techniques proposed for deep learning testing, including test data selection and system supervision. We present uncertainty-wizard, a tool that allows users to quantify such uncertainty and confidence in artificial neural networks. It is built on top of the industry-leading tf.keras deep learning API and provides a near-transparent, easy-to-understand interface. At the same time, it includes major performance optimizations, which we benchmarked on two different machines in several configurations.
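For readers unfamiliar with the kind of confidence estimate the tool targets, the sketch below shows one common approach, Monte Carlo dropout, written directly in plain tf.keras: dropout is kept active at inference time, several stochastic forward passes are collected, and their disagreement is used as an uncertainty score. This is a generic illustration under assumed settings (toy architecture, 32 samples, variation-ratio quantifier), not the uncertainty-wizard API itself.

```python
import numpy as np
import tensorflow as tf

# Toy classifier with a dropout layer; architecture is purely illustrative.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(784,)),
    tf.keras.layers.Dropout(0.5),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

def mc_dropout_predict(model, x, n_samples=32):
    """Run n_samples stochastic passes (dropout active) and aggregate them."""
    # training=True keeps dropout enabled at prediction time.
    samples = np.stack(
        [model(x, training=True).numpy() for _ in range(n_samples)], axis=0
    )
    mean_probs = samples.mean(axis=0)            # averaged class probabilities
    predictions = mean_probs.argmax(axis=-1)     # predicted class per input
    # Variation ratio: fraction of passes that disagree with the majority vote.
    votes = samples.argmax(axis=-1)              # shape (n_samples, batch)
    agreement = (votes == predictions[None, :]).mean(axis=0)
    uncertainty = 1.0 - agreement
    return predictions, uncertainty

x = np.random.rand(4, 784).astype("float32")     # placeholder inputs
preds, unc = mc_dropout_predict(model, x)
```

A higher uncertainty score flags inputs whose predictions the sampled passes disagree on, which is the signal that test-selection and supervision techniques mentioned in the abstract typically build on.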


