A Unified f-divergence Framework Generalizing VAE and GAN

05/11/2022
by   Jaime Roquero Gimenez, et al.

Developing deep generative models that flexibly incorporate diverse measures of probability distance is an important area of research. Here we develop a unified mathematical framework for f-divergence generative models, f-GM, that incorporates both VAE and f-GAN and enables tractable learning with general f-divergences. f-GM allows the experimenter to flexibly design the f-divergence function without changing the structure of the networks or the learning procedure. f-GM jointly models three components: a generator, an inference network, and a density estimator. It therefore simultaneously enables sampling, posterior inference of the latent variable, and evaluation of the likelihood of an arbitrary datum. f-GM belongs to the class of encoder-decoder GANs: our density estimator can be interpreted as playing the role of a discriminator between samples in the joint space of latent codes and observations. We prove that f-GM naturally simplifies to the standard VAE and to f-GAN as special cases, which illustrates the connections between different encoder-decoder GAN architectures. f-GM is compatible with general network architectures and optimizers. We leverage it to experimentally explore the effects, e.g. mode collapse and image sharpness, of different choices of f-divergence.
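For readers unfamiliar with the family of distances the abstract refers to: an f-divergence between distributions P and Q is defined as D_f(P||Q) = E_Q[f(p(x)/q(x))] for a convex function f with f(1) = 0, and different choices of f recover KL, reverse KL, total variation, and others. The following sketch (not code from the paper, just an illustration of the definition on a finite alphabet) shows how swapping the f-generator changes the divergence while the computation stays fixed, which is the kind of flexibility f-GM aims to expose:

```python
import math

# Convex generators f with f(1) = 0; D_f(P||Q) = sum_x q(x) * f(p(x)/q(x))
f_kl = lambda t: t * math.log(t)       # yields KL(P||Q)
f_rkl = lambda t: -math.log(t)         # yields reverse KL, i.e. KL(Q||P)
f_tv = lambda t: 0.5 * abs(t - 1)      # yields total variation distance

def f_divergence(p, q, f):
    """D_f(P||Q) over a finite alphabet (assumes q(x) > 0 everywhere)."""
    return sum(qx * f(px / qx) for px, qx in zip(p, q))

p = [0.5, 0.5]
q = [0.9, 0.1]
print(f_divergence(p, q, f_kl))   # KL(P||Q)
print(f_divergence(p, q, f_tv))   # 0.5 * sum |p - q| = 0.4
```

In f-GAN-style training the ratio p/q is of course not available in closed form; it is estimated variationally via the Fenchel conjugate of f, but the role of the generator function f is exactly the one shown here.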

Related research

07/16/2018 · Variational Inference: A Unified Framework of Generative Models and Some Revelations
We reinterpret variational inference from a new perspective. Via th...

09/27/2019 · "Best-of-Many-Samples" Distribution Matching
Generative Adversarial Networks (GANs) can achieve state-of-the-art samp...

06/19/2019 · LIA: Latently Invertible Autoencoder with Adversarial Learning
Deep generative models play an increasingly important role in machine le...

03/14/2019 · Diagnosing and Enhancing VAE Models
Although variational autoencoders (VAEs) represent a widely influential ...

11/07/2017 · Theoretical limitations of Encoder-Decoder GAN architectures
Encoder-decoder GAN architectures (e.g., BiGAN and ALI) seek to add an ...

05/26/2019 · OOGAN: Disentangling GAN with One-Hot Sampling and Orthogonal Regularization
Exploring the potential of GANs for unsupervised disentanglement learnin...

09/30/2019 · Towards Diverse Paraphrase Generation Using Multi-Class Wasserstein GAN
Paraphrase generation is an important and challenging natural language p...
