Hamiltonian Monte Carlo using an embedded Laplace approximation

By Charles C. Margossian et al.

Latent Gaussian models are a popular class of hierarchical models with applications in many fields. Performing Bayesian inference on such models can be challenging. Markov chain Monte Carlo algorithms struggle with the geometry of the resulting posterior distribution and can be prohibitively slow. An alternative is to use an integrated nested Laplace approximation, whereby we marginalize out the latent Gaussian variables and estimate the hyperparameters with a deterministic scheme. This type of inference typically only works for low-dimensional and unimodal hyperparameters. We bypass these limitations by coupling dynamic Hamiltonian Monte Carlo with an embedded Laplace approximation. Our implementation features a novel adjoint method to differentiate the marginal likelihood, which scales with high-dimensional hyperparameters. We prototype the method in the probabilistic programming framework Stan and test the utility of the embedded Laplace approximation on several models: a classic Gaussian process, a general linear regression model with a sparsity-inducing horseshoe prior, and a sparse kernel interaction model. The last two models are characterized by a high-dimensional and multimodal posterior distribution of the hyperparameters, and as such present novel applications of the embedded Laplace approximation. Depending on the case, the benefits are either a dramatic speed-up or an alleviation of the geometric pathologies that frustrate Hamiltonian Monte Carlo.
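To make the marginalization step concrete, here is a minimal sketch (not the paper's Stan implementation) of a Laplace-approximated log marginal likelihood for a toy latent Gaussian model with a Poisson likelihood. The model, the diagonal prior covariance, and all variable names are illustrative assumptions chosen so every step stays scalar:

```python
import math

import numpy as np


def laplace_log_marginal(theta, y, n_newton=25):
    """Laplace approximation to log p(y | theta) for a toy latent
    Gaussian model:
        f ~ N(0, theta * I),   y_i ~ Poisson(exp(f_i)).
    (Illustrative model, not the paper's; the diagonal prior makes
    the structure of the approximation easy to follow.)"""
    f = np.zeros_like(y, dtype=float)
    # Newton iterations to find the mode f_hat of p(f | y, theta).
    for _ in range(n_newton):
        W = np.exp(f)                       # -Hessian of Poisson log likelihood
        grad = (y - np.exp(f)) - f / theta  # gradient of the log joint density
        f += grad / (W + 1.0 / theta)       # Newton step (diagonal Hessian)
    W = np.exp(f)
    log_lik = np.sum(y * f - np.exp(f)) - sum(math.lgamma(k + 1) for k in y)
    # Standard Laplace formula at the mode, with prior covariance K = theta * I:
    #   log p(y | f_hat) - f_hat' K^{-1} f_hat / 2 - log det(I + K W) / 2
    return (log_lik
            - np.dot(f, f) / (2.0 * theta)
            - 0.5 * np.sum(np.log1p(theta * W)))


y = np.array([0, 1, 3, 2, 0])
print(laplace_log_marginal(1.0, y))
```

In the approach the abstract describes, Hamiltonian Monte Carlo would then explore the hyperparameter (here `theta`) using gradients of this approximate marginal density; the paper's adjoint method is what makes that differentiation scale to high-dimensional hyperparameters.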



