Probabilistic Circuits for Variational Inference in Discrete Graphical Models

10/22/2020
by Andy Shih, et al.

Inference in discrete graphical models with variational methods is difficult because of the inability to re-parameterize gradients of the Evidence Lower Bound (ELBO). Many sampling-based methods have been proposed for estimating these gradients, but they suffer from high bias or variance. In this paper, we propose a new approach that leverages the tractability of probabilistic circuit models, such as Sum-Product Networks (SPNs), to compute ELBO gradients exactly (without sampling) for a certain class of densities. In particular, we show that selective-SPNs are suitable as an expressive variational distribution, and prove that when the log-density of the target model is a polynomial, the corresponding ELBO can be computed analytically. To scale to graphical models with thousands of variables, we develop an efficient and effective construction of selective-SPNs with size O(kn), where n is the number of variables and k is an adjustable hyperparameter. We demonstrate our approach on three types of graphical models: Ising models, Latent Dirichlet Allocation, and factor graphs from the UAI Inference Competition. Selective-SPNs give a better lower bound than mean-field and structured mean-field, and are competitive with approximations that do not provide a lower bound, such as Loopy Belief Propagation and Tree-Reweighted Belief Propagation. Our results show that probabilistic circuits are promising tools for variational inference in discrete graphical models, as they combine tractability and expressivity.
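As a rough illustration of the analytic-ELBO claim (the notation below is ours, not taken from the paper), consider an Ising model over binary variables, whose unnormalized log-density is a degree-2 polynomial:

\log \tilde{p}(x) = \sum_i \theta_i x_i + \sum_{(i,j) \in E} \theta_{ij} x_i x_j, \qquad x \in \{0,1\}^n

For any variational distribution q, the ELBO on the log-partition function is

\log Z \;\ge\; \mathbb{E}_q[\log \tilde{p}(x)] + H(q) \;=\; \sum_i \theta_i\, \mathbb{E}_q[x_i] + \sum_{(i,j) \in E} \theta_{ij}\, \mathbb{E}_q[x_i x_j] + H(q)

Because the log-density is a polynomial, the expected log-density reduces to low-order moments of q (here, single and pairwise marginals), which a tractable probabilistic circuit can compute exactly via marginal queries; selectivity (determinism) additionally makes the entropy H(q) computable in a single pass over the circuit, so the bound and its gradients can be evaluated without sampling.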


research
06/22/2012

Fast Variational Inference in the Conjugate Exponential Family

We present a general method for deriving collapsed variational inference...
research
06/17/2014

Lifted Tree-Reweighted Variational Inference

We analyze variational inference for highly symmetric graphical models s...
research
08/02/2018

Winner-Take-All as Basic Probabilistic Inference Unit of Neuronal Circuits

Experimental observations of neuroscience suggest that the brain is work...
research
10/30/2017

A Connection between Feed-Forward Neural Networks and Probabilistic Graphical Models

Two of the most popular modelling paradigms in computer vision are feed-...
research
02/08/2022

PGMax: Factor Graphs for Discrete Probabilistic Graphical Models and Loopy Belief Propagation in JAX

PGMax is an open-source Python package for easy specification of discret...
research
02/23/2015

Scalable Variational Inference in Log-supermodular Models

We consider the problem of approximate Bayesian inference in log-supermo...
research
06/01/2019

Smoothing Structured Decomposable Circuits

We study the task of smoothing a circuit, i.e., ensuring that all childr...
