NAUTILUS: boosting Bayesian importance nested sampling with deep learning

06/29/2023
by Johannes U. Lange, et al.

We introduce a novel approach to boost the efficiency of the importance nested sampling (INS) technique for Bayesian posterior and evidence estimation using deep learning. Unlike rejection-based sampling methods such as vanilla nested sampling (NS) or Markov chain Monte Carlo (MCMC) algorithms, importance sampling techniques can use all likelihood evaluations for posterior and evidence estimation. However, for efficient importance sampling, one needs proposal distributions that closely mimic the posterior distribution. We show how to combine INS with deep learning via neural network regression to accomplish this task. We also introduce NAUTILUS, a reference open-source Python implementation of this technique for Bayesian posterior and evidence estimation. We compare NAUTILUS against popular NS and MCMC packages, including EMCEE, DYNESTY, ULTRANEST and POCOMC, on a variety of challenging synthetic problems and real-world applications in exoplanet detection, galaxy spectral energy distribution (SED) fitting and cosmology. In all applications, the sampling efficiency of NAUTILUS is substantially higher than that of all other samplers, often by more than an order of magnitude. Simultaneously, NAUTILUS delivers highly accurate results and needs fewer likelihood evaluations than all other samplers tested. We also show that NAUTILUS scales well with the dimensionality of the likelihood and is easily parallelizable to many CPUs.
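To make the core idea concrete, the sketch below illustrates the importance-sampling estimator the abstract builds on: the Bayesian evidence Z = ∫ L(θ) π(θ) dθ is estimated by drawing from a proposal q(θ) that roughly matches the posterior and averaging the weights L(θ)π(θ)/q(θ), so every likelihood evaluation contributes to the estimate. This is a minimal stdlib-only toy, not NAUTILUS's implementation: the Gaussian likelihood, uniform prior bounds, and hand-chosen Gaussian proposal are illustrative assumptions (NAUTILUS instead learns its proposal with neural network regression).

```python
import math
import random

random.seed(42)

def log_likelihood(theta):
    # Toy problem: standard normal log-likelihood centered at zero.
    return -0.5 * theta**2 - 0.5 * math.log(2 * math.pi)

LO, HI = -5.0, 5.0                    # uniform prior bounds (assumed)
LOG_PRIOR = -math.log(HI - LO)        # log of the flat prior density

# Proposal chosen by hand to roughly mimic the posterior: N(0, 1.2^2).
MU, SIGMA = 0.0, 1.2

def log_proposal(theta):
    return (-0.5 * ((theta - MU) / SIGMA) ** 2
            - math.log(SIGMA * math.sqrt(2 * math.pi)))

n = 100_000
weights = []
for _ in range(n):
    theta = random.gauss(MU, SIGMA)
    if not LO <= theta <= HI:
        weights.append(0.0)           # prior density vanishes outside the bounds
        continue
    # Importance weight: likelihood * prior / proposal.
    weights.append(math.exp(log_likelihood(theta) + LOG_PRIOR
                            - log_proposal(theta)))

z_hat = sum(weights) / n
# Analytic evidence here is (1/10) * Prob(-5 < N(0,1) < 5), i.e. almost exactly 0.1.
print(f"evidence estimate: {z_hat:.4f}")
```

Because the proposal is wider than the posterior, the weights stay bounded and the estimator has low variance; the closer q is to the posterior, the fewer samples are needed, which is exactly why NAUTILUS invests in learning an accurate proposal.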

Related research:

- dynesty: A Dynamic Nested Sampling Package for Estimating Bayesian Posteriors and Evidences (04/03/2019)
- Neural Importance Sampling for Rapid and Reliable Gravitational-Wave Inference (10/11/2022)
- Normalizing Constant Estimation with Gaussianized Bridge Sampling (12/12/2019)
- Embarrassingly parallel MCMC using deep invertible transformations (03/11/2019)
- Importance Sampling of Many Lights with Reinforcement Lightcuts Learning (11/22/2019)
- Quantile Importance Sampling (05/04/2023)
- MCMC-driven importance samplers (05/06/2021)
