Encoding prior knowledge in the structure of the likelihood

12/11/2018
by Jakob Knollmüller, et al.

Inference in deep hierarchical models is problematic due to strong dependencies between the hierarchy levels. We investigate a specific transformation of the model parameters based on the multivariate distributional transform. This transformation, a special form of the reparametrization trick, flattens the hierarchy and leads to a standard Gaussian prior on all resulting parameters. It also transfers all prior information into the structure of the likelihood, thereby decoupling the transformed parameters a priori from each other. A variational Gaussian approximation in this standardized space is excellent in situations of relatively uninformative data. Additionally, the curvature of the log-posterior is well conditioned in directions that are weakly constrained by the data, allowing for fast inference in such scenarios. In an example, we perform the transformation explicitly for Gaussian process regression with an a priori unknown correlation structure. Deep models are inferred rapidly in highly informed situations and slowly in poorly informed ones. The flat model shows exactly the opposite performance pattern. A synthesis of the deep and the flat perspective combines their advantages and overcomes their individual limitations, leading to faster inference.
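The standardization described above can be sketched on a toy two-level hierarchy (a hypothetical stand-in for the paper's Gaussian process example; the function name and parameters are illustrative, not from the paper). All latents carry independent standard Gaussian priors, and the hierarchical prior structure is absorbed into the deterministic mapping, and hence into the likelihood:

```python
import numpy as np

def standardized_forward(xi, mu=0.0, sigma=1.0):
    """Map standard-normal latents xi to a toy two-level hierarchy:
    tau ~ LogNormal(mu, sigma), and x | tau ~ N(0, tau^2).

    After this reparametrization all prior information lives in the
    mapping itself, while the latents xi are a priori independent
    standard Gaussians.
    """
    xi = np.asarray(xi, dtype=float)
    # Distributional transform for the hyperparameter: a standard
    # normal pushed through exp(mu + sigma * .) is log-normal.
    tau = np.exp(mu + sigma * xi[0])
    # Given tau, the signal has prior N(0, tau^2); scaling standard
    # normals by tau realizes exactly that conditional prior.
    x = tau * xi[1:]
    return tau, x

# Prior samples are obtained by simply drawing xi from N(0, I):
rng = np.random.default_rng(0)
tau, x = standardized_forward(rng.standard_normal(4))
```

A variational Gaussian approximation fit over xi in this standardized space then automatically respects the original hierarchical prior, which is the decoupling the abstract refers to.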

Related research

03/25/2020 - Scalable Variational Gaussian Process Regression Networks
Gaussian process regression networks (GPRN) are powerful Bayesian models...

12/05/2018 - Bayesian Spatial Inversion and Conjugate Selection Gaussian Prior Models
We introduce the concept of conjugate prior models for a given likelihoo...

06/07/2019 - Multivariate Conditional Transformation Models
Regression models describing the joint distribution of multivariate resp...

05/24/2018 - Log Gaussian Cox Process Networks
We generalize the log Gaussian Cox process (LGCP) framework to model mul...

06/24/2020 - Likelihood-Free Gaussian Process for Regression
Gaussian process regression can flexibly represent the posterior distrib...

09/25/2017 - A general framework for uncertainty quantification under non-Gaussian input dependencies
Uncertainty quantification (UQ) deals with the estimation of statistics ...

06/29/2020 - Kendall transformation
Kendall transformation is a conversion of an ordered feature into a vect...
