On the regularization of Wasserstein GANs

by Henning Petzka, et al.
University of Bonn

Since their invention, generative adversarial networks (GANs) have become a popular approach for learning to model a distribution of real (unlabeled) data. Convergence problems during training are overcome by Wasserstein GANs, which minimize the distance between the model and the empirical distribution in terms of a different metric, but thereby introduce a Lipschitz constraint into the optimization problem. A simple way to enforce the Lipschitz constraint on the class of functions that can be modeled by the neural network is weight clipping. It was proposed that training can instead be improved by augmenting the loss with a regularization term that penalizes the deviation of the gradient norm of the critic (as a function of the network's input) from one. We present theoretical arguments for why a weaker regularization term enforcing the Lipschitz constraint is preferable. These arguments are supported by experimental results on toy data sets.
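The contrast the abstract draws can be made concrete. The gradient penalty of WGAN-GP punishes any deviation of the critic's gradient norm from one, while a weaker "Lipschitz penalty" only punishes norms that exceed one. The following is a minimal NumPy sketch of the two penalty terms, assuming the gradient norms at the sampled points have already been computed; the function names and the penalty weight `lam` are illustrative, not taken from the paper.

```python
import numpy as np

def two_sided_penalty(grad_norms, lam=10.0):
    """WGAN-GP-style term: penalize any deviation of the
    critic's gradient norm from one (two-sided)."""
    return lam * np.mean((grad_norms - 1.0) ** 2)

def one_sided_penalty(grad_norms, lam=10.0):
    """Weaker Lipschitz penalty: penalize only gradient
    norms exceeding one, leaving smaller norms unconstrained."""
    return lam * np.mean(np.maximum(grad_norms - 1.0, 0.0) ** 2)

# Example: a critic whose gradient norm is below one at some points
norms = np.array([0.5, 1.0, 1.5])
gp = two_sided_penalty(norms)   # also penalizes the norm 0.5
lp = one_sided_penalty(norms)   # penalizes only the norm 1.5
```

Note how the one-sided penalty is zero wherever the norm is at most one, so it enforces the Lipschitz constraint without pushing the gradient norm up to one everywhere.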



Virtual Adversarial Lipschitz Regularization

Generative adversarial networks (GANs) are one of the most popular appro...

Orthogonal Wasserstein GANs

Wasserstein-GANs have been introduced to address the deficiencies of gen...

Language Modeling with Generative Adversarial Networks

Generative Adversarial Networks (GANs) have been promising in the field ...

Lipschitz Generative Adversarial Nets

In this paper we study the convergence of generative adversarial network...

Understanding the Effectiveness of Lipschitz Constraint in Training of GANs via Gradient Analysis

This paper aims to bring a new perspective for understanding GANs, by de...

HRVGAN: High Resolution Video Generation using Spatio-Temporal GAN

In this paper, we present a novel network for high resolution video gene...

Moreau-Yosida f-divergences

Variational representations of f-divergences are central to many machine...
