Generated Loss, Augmented Training, and Multiscale VAE

04/23/2019
by Jason Chou, et al.

The variational autoencoder (VAE) framework remains a popular option for training unsupervised generative models, especially for discrete data, where generative adversarial networks (GANs) require workarounds to produce gradients for the generator. In our work modeling US postal addresses, we show that our discrete VAE with a tree-recursive architecture has only a limited ability to capture field correlations within structured data, even after overcoming posterior collapse with scheduled sampling and tuning of the KL-divergence weight β. Worse, the VAE appears to have difficulty mapping its own generated samples back to the latent space: the VAE loss of generated samples lags behind, or even increases, over the course of training. Motivated by this observation, we show that augmenting the training data with generated variants (augmented training) and training a VAE with multiple values of β simultaneously (multiscale VAE) both improve the VAE's generation quality. Despite their differences in motivation and emphasis, we show that augmented training and multiscale VAE are in fact connected and have similar effects on the model.
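For context, the β mentioned above is the weight on the KL-divergence term of the VAE objective, and augmented training mixes the model's own generated samples back into the training data. The sketch below illustrates both ideas. It assumes PyTorch and a Gaussian-latent VAE over token sequences; the model interface (encode, decode, latent_dim) and all shapes are hypothetical simplifications for illustration, not the authors' tree-recursive implementation.

import torch
import torch.nn.functional as F

def beta_vae_loss(model, x, beta):
    # KL-weighted VAE objective: reconstruction loss + beta * KL(q(z|x) || p(z)).
    mu, logvar = model.encode(x)                           # parameters of q(z|x)
    z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterization trick
    logits = model.decode(z)                               # p(x|z) logits, shape (B, T, V)
    recon = F.cross_entropy(logits.transpose(1, 2), x)     # token-level reconstruction loss
    kl = -0.5 * torch.mean(
        torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1)
    )
    return recon + beta * kl

def augment_with_generated(model, x, n_generated):
    # Augmented training: decode samples from the prior and mix the
    # generated variants into the current training batch.
    with torch.no_grad():
        z = torch.randn(n_generated, model.latent_dim, device=x.device)
        x_gen = model.decode(z).argmax(dim=-1)             # greedy decode to token ids
    return torch.cat([x, x_gen], dim=0)

In this simplified form, the "generated loss" of the title roughly corresponds to evaluating beta_vae_loss on the model's own samples rather than on training data, which is the quantity observed to lag behind or increase during training.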
