GP-select: Accelerating EM using adaptive subspace preselection

12/10/2014
by Jacquelyn A. Shelton, et al.

We propose a nonparametric procedure for fast inference in generative graphical models when the number of latent states is very large. The approach is based on iterative latent variable preselection: we alternate between learning a 'selection function' that reveals the relevant latent variables, and using this function to obtain a compact approximation of the posterior distribution for EM. This makes inference tractable where the number of possible latent states is, e.g., exponential in the number of latent variables, so that an exact approach would be computationally infeasible. The selection function is learned entirely from the observed data and the current EM state via Gaussian process regression. This contrasts with earlier approaches, in which selection functions were manually designed for each problem setting. We show that our approach performs as well as these bespoke selection functions on a wide variety of inference problems; in particular, for the challenging case of a hierarchical model for object localization with occlusion, it matches a customized state-of-the-art selection method at a far lower computational cost.
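To make the idea concrete, here is a minimal sketch of the preselection step described above: a GP regressor maps an observed data point to per-latent relevance scores (trained on relevance targets from the previous EM iteration), and only the top-K latent variables are kept for the truncated posterior. All names (`gp_select`, `rbf_kernel`, the toy relevance targets) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # squared-exponential kernel between the row vectors of A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_select(X_train, R_train, x_new, K_select, noise=1e-3):
    """Predict per-latent relevance for x_new via GP regression and
    return the indices of the K_select most relevant latent variables.

    X_train: (N, D) observed data points
    R_train: (N, H) relevance targets (e.g. from the previous EM state)
    """
    # GP posterior mean: k(x*, X) (K + noise I)^{-1} R
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    k_star = rbf_kernel(x_new[None, :], X_train)   # (1, N)
    alpha = np.linalg.solve(K, R_train)            # (N, H)
    relevance = (k_star @ alpha).ravel()           # (H,) predicted scores
    # keep only the K_select highest-scoring latents (descending order)
    return np.argsort(relevance)[-K_select:][::-1]

# Toy usage: 3 data points, 4 latent variables, select the top 2 latents
X_train = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]])
R_train = np.array([[1.0, 0.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0, 0.0],
                    [0.0, 0.0, 1.0, 0.0]])
x_new = np.array([0.0, 0.0])
selected = gp_select(X_train, R_train, x_new, K_select=2)
```

In the full algorithm, EM would then evaluate the posterior only over the selected subspace of latent states, and the resulting expectations would provide fresh relevance targets for retraining the selection function in the next iteration.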


