Quasi-symplectic Langevin Variational Autoencoder

by Zihao Wang, et al.

The variational autoencoder (VAE), one of the most thoroughly investigated generative models, is widely used in modern neural learning research. Applying VAEs to practical tasks with high-dimensional data and large datasets often runs into the problem of constructing low-variance evidence lower bounds. Markov chain Monte Carlo (MCMC) is an effective approach to tightening the evidence lower bound (ELBO) when approximating the posterior distribution. The Hamiltonian Variational Autoencoder (HVAE) is one such MCMC-inspired approach, constructing an unbiased, low-variance ELBO that is also amenable to the reparameterization trick. While this solution significantly improves posterior estimation, HVAE has a major drawback: the leapfrog integrator must evaluate the posterior gradient twice per step, which degrades inference efficiency and demands considerable GPU memory. This flaw limits the applicability of Hamiltonian-based inference frameworks to large-scale networks. To tackle this problem, we propose a Quasi-symplectic Langevin Variational Autoencoder (Langevin-VAE), which offers a significant improvement in resource efficiency. We demonstrate, both qualitatively and quantitatively, the effectiveness of the Langevin-VAE compared to state-of-the-art gradient-informed inference frameworks.
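The efficiency argument above can be made concrete with a minimal sketch. The snippet below contrasts one leapfrog step (as used in HVAE, two gradient evaluations per step) with one overdamped Langevin (Euler-Maruyama) step, which needs a single gradient evaluation. This is an illustrative simplification, not the paper's exact quasi-symplectic scheme; the Gaussian target behind `grad_log_post` is a hypothetical stand-in for a real posterior.

```python
import numpy as np

def grad_log_post(z):
    # Hypothetical target: standard Gaussian posterior, so grad log p(z) = -z.
    return -z

def leapfrog_step(z, p, eps):
    # Leapfrog integrator used in Hamiltonian methods such as HVAE:
    # two gradient evaluations per step (momentum half-steps before and after).
    p = p + 0.5 * eps * grad_log_post(z)   # first gradient call
    z = z + eps * p                        # position full step
    p = p + 0.5 * eps * grad_log_post(z)   # second gradient call
    return z, p

def langevin_step(z, eps, rng):
    # Overdamped Langevin (Euler-Maruyama) step: one gradient evaluation
    # plus injected Gaussian noise.
    noise = rng.standard_normal(z.shape)
    return z + 0.5 * eps**2 * grad_log_post(z) + eps * noise

rng = np.random.default_rng(0)
z = rng.standard_normal(2)
p = rng.standard_normal(2)
z_hmc, p_hmc = leapfrog_step(z, p, eps=0.1)
z_lang = langevin_step(z, eps=0.1, rng=rng)
```

Halving the gradient evaluations per step is what drives the resource savings claimed for the Langevin-VAE; the leapfrog step remains exactly time-reversible (stepping forward with negated momentum recovers the starting point), which the Langevin step trades away for the cheaper update.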


