Sobolev Wasserstein GAN

12/07/2020
by Minkai Xu, et al.

Wasserstein GANs (WGANs), built upon the Kantorovich-Rubinstein (KR) duality of the Wasserstein distance, are among the most theoretically sound GAN models. In practice, however, they do not always outperform other GAN variants, largely because the Lipschitz condition required by the KR duality is imperfectly enforced. The community has proposed many different implementations of the Lipschitz constraint, yet the restriction remains hard to satisfy exactly in practice. In this paper, we argue that the strong Lipschitz constraint may be unnecessary for optimization. Instead, we take a step back and relax it. Theoretically, we first derive a more general dual form of the Wasserstein distance, called the Sobolev duality, which relaxes the Lipschitz constraint while maintaining the favorable gradient property of the Wasserstein distance. Moreover, we show that the KR duality is a special case of the Sobolev duality. Based on the relaxed duality, we further propose a generalized WGAN training scheme named Sobolev Wasserstein GAN (SWGAN), and empirically demonstrate the improvement of SWGAN over existing methods with extensive experiments.
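The difference between the pointwise Lipschitz constraint of the KR duality and a relaxed, Sobolev-type constraint in expectation can be sketched numerically. The following is a hedged illustration (not the paper's actual objective): it uses a toy linear critic f(x) = w·x, whose gradient with respect to x is the constant vector w, so both penalty surrogates can be evaluated in closed form with NumPy. The penalty forms and the sampling scheme are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear critic f(x) = w @ x; its gradient w.r.t. x is w everywhere.
w = np.array([3.0, 4.0])          # ||w|| = 5, so f is 5-Lipschitz
xs = rng.normal(size=(1000, 2))   # sample points where the constraint is checked

# For a linear critic the gradient norm is the same at every sample point.
grad_norms = np.full(len(xs), np.linalg.norm(w))

# KR-style (WGAN-GP-like) surrogate: penalize the pointwise deviation of
# ||grad f(x)|| from 1 at every sample, i.e. enforce 1-Lipschitzness everywhere.
kr_penalty = np.mean((grad_norms - 1.0) ** 2)

# Sobolev-style surrogate: constrain only the expected squared gradient norm,
# E[||grad f||^2] <= 1, penalizing a single aggregate violation.
sobolev_gap = max(np.mean(grad_norms ** 2) - 1.0, 0.0)

print(kr_penalty)   # 16.0: every individual point violates the pointwise bound
print(sobolev_gap)  # 24.0: one aggregate constraint-violation term
```

In real training both surrogates would be added to the critic loss with a penalty weight; the point of the relaxation is that a constraint in expectation admits critics whose gradient norm exceeds 1 in some regions, while (as the paper argues) still preserving the useful gradient behavior of the Wasserstein distance.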


Related research

07/02/2018 · Understanding the Effectiveness of Lipschitz Constraint in Training of GANs via Gradient Analysis
This paper aims to bring a new perspective for understanding GANs, by de...

02/15/2019 · Lipschitz Generative Adversarial Nets
In this paper we study the convergence of generative adversarial network...

11/29/2019 · Orthogonal Wasserstein GANs
Wasserstein-GANs have been introduced to address the deficiencies of gen...

11/18/2018 · GAN-QP: A Novel GAN Framework without Gradient Vanishing and Lipschitz Constraint
We know SGAN may have a risk of gradient vanishing. A significant improv...

02/26/2021 · Moreau-Yosida f-divergences
Variational representations of f-divergences are central to many machine...

04/30/2022 · A Simple Duality Proof for Wasserstein Distributionally Robust Optimization
We present a short and elementary proof of the duality for Wasserstein d...

05/19/2017 · Relaxed Wasserstein with Applications to GANs
We propose a novel class of statistical divergences called Relaxed Wasse...
