Novelty Detection via Robust Variational Autoencoding

06/09/2020
by   Chieh-Hsin Lai, et al.

We propose a new method for novelty detection that tolerates nontrivial corruption of the training points, whereas previous works assumed little or no corruption. Our method trains a robust variational autoencoder (VAE) that aims to model the uncorrupted training points. To gain robustness to corruption, we make three changes to the standard VAE: (1) modeling the latent distribution as a mixture of Gaussian inliers and outliers, while using only the inlier component at test time; (2) regularizing with the Wasserstein-1 metric instead of the Kullback-Leibler divergence; and (3) using a least absolute deviation reconstruction error, which is equivalent to assuming a heavy-tailed likelihood. We demonstrate state-of-the-art results on standard benchmark datasets for novelty detection.
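To make changes (2) and (3) concrete, here is a minimal PyTorch sketch of a VAE loss with an L1 (least absolute deviation) reconstruction term and a cheap coordinate-wise Wasserstein-1 penalty toward a standard Gaussian prior. This is an illustration under simplifying assumptions, not the authors' implementation: the architecture, the single-Gaussian prior (the paper uses a Gaussian mixture for inliers and outliers), the sliced one-sample W1 estimate, and the weight `lambda_reg` are all hypothetical choices.

```python
import torch
import torch.nn as nn

class RobustVAE(nn.Module):
    """Toy VAE with a Gaussian encoder and a deterministic decoder."""
    def __init__(self, in_dim=784, latent_dim=16):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                     nn.Linear(128, 2 * latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                     nn.Linear(128, in_dim))

    def forward(self, x):
        mu, log_var = self.encoder(x).chunk(2, dim=-1)
        # Reparameterization trick: z = mu + sigma * eps
        z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()
        return self.decoder(z), z

def robust_vae_loss(model, x, lambda_reg=1.0):
    x_hat, z = model(x)
    # Change (3): L1 reconstruction error, i.e. a heavy-tailed (Laplace)
    # likelihood, which down-weights the influence of corrupted points.
    recon = (x_hat - x).abs().sum(dim=-1).mean()
    # Change (2): Wasserstein-1 regularization instead of a KL term.
    # Illustrative estimate: per-coordinate W1 between sorted batch
    # samples of z and sorted samples from the N(0, I) prior.
    z_prior = torch.randn_like(z)
    w1 = (torch.sort(z, dim=0).values
          - torch.sort(z_prior, dim=0).values).abs().mean()
    return recon + lambda_reg * w1
```

At test time, the per-point L1 reconstruction error can serve as a simple novelty score: points far from the learned inlier model reconstruct poorly.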

Related research

Robust Variational Autoencoder (05/23/2019)
Machine learning methods often need a large amount of labeled training d...

Exponentially Tilted Gaussian Prior for Variational Autoencoder (11/30/2021)
An important property for deep neural networks to possess is the ability ...

Robust Variational Autoencoder for Tabular Data with Beta Divergence (06/15/2020)
We propose a robust variational autoencoder with β divergence for tabula...

Robust Vector Quantized-Variational Autoencoder (02/04/2022)
Image generative models can learn the distributions of the training data...

Training VAEs Under Structured Residuals (04/03/2018)
Variational auto-encoders (VAEs) are a popular and powerful deep generat...

Novelty Detection via Blurring (11/27/2019)
Conventional out-of-distribution (OOD) detection schemes based on variat...

ARAE: Adversarially Robust Training of Autoencoders Improves Novelty Detection (03/12/2020)
Autoencoders (AE) have recently been widely employed to approach the nov...
