A Universal Marginalizer for Amortized Inference in Generative Models

11/02/2017
by Laura Douglas et al.

We consider the problem of inference in a causal generative model where the set of available observations differs between data instances. We show how combining samples drawn from the graphical model with an appropriate masking function makes it possible to train a single neural network to approximate all the corresponding conditional marginal distributions and thus amortize the cost of inference. We further demonstrate that the efficiency of importance sampling may be improved by basing proposals on the output of the neural network. We also outline how the same network can be used to generate samples from an approximate joint posterior via a chain decomposition of the graph.
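The core idea — training one network to approximate every conditional marginal by randomly masking subsets of observations in complete samples — can be sketched as follows. This is a minimal illustration, not the authors' implementation: binary variables, the toy sampler `sample_from_model`, the network shape, and all hyperparameters are assumptions made here for concreteness.

```python
# Minimal sketch of the masking-based training described above.
# All names and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

n_vars = 20  # number of nodes in the (hypothetical) graphical model

def sample_from_model(batch_size):
    """Stand-in for ancestral sampling from the causal generative model.
    Returns complete binary assignments of shape (batch_size, n_vars)."""
    return (torch.rand(batch_size, n_vars) < 0.5).float()

# A single network predicts p(x_i = 1 | observed subset) for all i at once.
# Each variable is encoded as (masked value, observed-flag) so the network
# can distinguish "unobserved" from "observed to be 0".
net = nn.Sequential(
    nn.Linear(2 * n_vars, 128), nn.ReLU(),
    nn.Linear(128, n_vars), nn.Sigmoid(),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
bce = nn.BCELoss()

for step in range(1000):
    x = sample_from_model(64)                                 # complete samples
    mask = (torch.rand_like(x) < torch.rand(64, 1)).float()   # random evidence sets
    inp = torch.cat([x * mask, mask], dim=1)                  # hide unobserved values
    probs = net(inp)                                          # predicted marginals for all vars
    loss = bce(probs, x)                                      # cross-entropy against the true sample
    opt.zero_grad()
    loss.backward()
    opt.step()
```

At test time the same trained network can be queried with whatever evidence a given data instance provides; per the abstract, its outputs can then serve as importance-sampling proposals, or be applied variable-by-variable along a chain decomposition of the graph to draw approximate joint posterior samples.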
