Generative Class-conditional Autoencoders

12/22/2014
by Jan Rudy et al.

Recent work by Bengio et al. (2013) proposes a sampling procedure for denoising autoencoders that involves learning the transition operator of a Markov chain. The transition operator is typically unimodal, which limits its capacity to model complex data. In order to perform efficient sampling from conditional distributions, we extend this work, both theoretically and algorithmically, to gated autoencoders (Memisevic, 2013). The proposed model is able to generate convincing class-conditional samples when trained on both the MNIST and TFD datasets.
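The sampling procedure alternates two steps: corrupt the current sample, then pass the corrupted version through the trained denoiser; conditioning the denoiser on a class label yields class-conditional samples. The following is a minimal sketch of that Markov chain in NumPy, not the paper's implementation: sample_conditional_chain, corrupt, and the placeholder denoise are hypothetical stand-ins (a real run would plug in a trained class-conditional gated autoencoder as the denoiser).

    import numpy as np

    def sample_conditional_chain(denoise, corrupt, y, x0, n_steps, rng):
        # Markov-chain sampling in the spirit of Bengio et al. (2013):
        # alternately corrupt the current sample, then denoise it.
        # Conditioning the denoiser on the class label y mirrors the
        # gated-autoencoder extension, where the label modulates the
        # transition operator.
        x = x0
        samples = []
        for _ in range(n_steps):
            x_tilde = corrupt(x, rng)   # draw from the corruption distribution C(x~ | x)
            x = denoise(x_tilde, y)     # reconstruction P(x | x~, y) from the trained model
            samples.append(x.copy())
        return samples

    # Hypothetical stand-ins, for illustration only:
    rng = np.random.default_rng(0)
    corrupt = lambda x, rng: x + rng.normal(scale=0.5, size=x.shape)  # Gaussian corruption
    denoise = lambda x_tilde, y: np.clip(x_tilde, 0.0, 1.0)           # placeholder for a trained model
    chain = sample_conditional_chain(denoise, corrupt, y=3,
                                     x0=rng.random(784), n_steps=50, rng=rng)

Because the learned reconstruction distribution at each step is typically unimodal, long-range multimodal structure must come from running the chain, which is the limitation the gated, class-conditional extension addresses.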

