Disentangled Representation Learning with Information Maximizing Autoencoder

04/18/2019
by Kazi Nazmul Haque, et al.

Learning disentangled representations from unlabelled data is a non-trivial problem. In this paper we propose the Information Maximizing Autoencoder (InfoAE), in which the encoder learns a powerful disentangled representation by maximizing the mutual information between the representation and given information in an unsupervised fashion. We evaluated our model on the MNIST dataset and achieved 98.9% (± 0.1%) test accuracy using completely unsupervised training.
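
As a rough illustration of the core idea, the sketch below pairs an autoencoder's reconstruction loss with an InfoGAN-style variational lower bound on the mutual information between a sampled categorical code and the learned representation. The module names, layer sizes, noise/code split, and loss weighting are all illustrative assumptions, not the paper's exact InfoAE architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative sizes (assumptions): 10-way categorical code, 22-dim noise,
# 32-dim representation, flattened 28x28 MNIST images.
N_CLASSES, NOISE_DIM, Z_DIM, X_DIM = 10, 22, 32, 784

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(X_DIM, 256), nn.ReLU(),
                                 nn.Linear(256, Z_DIM))
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Maps a latent vector (noise + code, or an encoded image) back to pixels."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(Z_DIM, 256), nn.ReLU(),
                                 nn.Linear(256, X_DIM))
    def forward(self, z):
        return torch.sigmoid(self.net(z))

class QHead(nn.Module):
    """Auxiliary network: predicts the categorical code c from the representation."""
    def __init__(self):
        super().__init__()
        self.net = nn.Linear(Z_DIM, N_CLASSES)
    def forward(self, z):
        return self.net(z)  # logits over code categories

def training_step(x_real, enc, dec, q, lam=1.0):
    batch = x_real.size(0)

    # 1) Plain autoencoding of real images (reconstruction term).
    z_real = enc(x_real)
    recon = F.binary_cross_entropy(dec(z_real), x_real)

    # 2) Mutual-information term: sample a known code c, build a latent from
    #    noise + one-hot(c), decode it, re-encode the result, and ask QHead to
    #    recover c. Minimizing this cross-entropy maximizes a variational
    #    lower bound on the mutual information between c and the representation.
    c = torch.randint(0, N_CLASSES, (batch,))
    z_fake = torch.cat([torch.randn(batch, NOISE_DIM),
                        F.one_hot(c, N_CLASSES).float()], dim=1)
    z_rec = enc(dec(z_fake))
    mi_loss = F.cross_entropy(q(z_rec), c)

    return recon + lam * mi_loss
```

In this sketch the categorical code plays the role of the "given information": because QHead must recover it from the encoder's output, the representation is pushed to keep that factor explicit, which is the sense in which the objective encourages disentanglement. Evaluation on MNIST would then, for example, read off the predicted code as a cluster/class assignment.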
