Do GAN Loss Functions Really Matter?

11/23/2018
by Yipeng Qin, et al.

In this paper, we address the recent controversy over whether Lipschitz regularization or the choice of loss function is responsible for the success of Generative Adversarial Network (GAN) training. One side attributes the success of GAN training to the choice of loss function [16, 2, 5], while the other suggests that Lipschitz regularization is the key to good results [17, 3, 18, 19]. We provide a theoretical and experimental analysis of how Lipschitz regularization interacts with the loss function and derive the following insights: (i) We show that popular GANs (NS-GAN [4], LS-GAN [16], WGAN [2]) perform equally well when the discriminator is regularized with a small Lipschitz constant, but that quality and diversity degrade for larger Lipschitz constants for all of them except WGAN. (ii) We show that all of these loss functions degenerate to linear ones for small Lipschitz constants, which explains why the GANs perform so similarly in that regime: NS-GAN and LS-GAN match WGAN only because their losses degenerate to the WGAN loss, whereas for larger Lipschitz constants only WGAN performs well while NS-GAN and LS-GAN break down. To further illustrate this point, we demonstrate that even "ridiculous" loss functions such as sin and exp perform similarly to NS-GAN, LS-GAN, and WGAN when small Lipschitz constants are used.
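The core of insight (ii) is a first-order argument: when a small Lipschitz constant confines the discriminator output to a narrow range around zero (on a bounded input domain), the NS-GAN and LS-GAN losses are well approximated by their first-order Taylor expansions, which are affine in the discriminator output, i.e., the WGAN loss up to scale and offset. The following minimal numerical sketch (ours, not from the paper) checks this for the per-sample "real" terms of the three losses:

```python
import numpy as np

# Discriminator outputs confined to a narrow range around zero, as a small
# Lipschitz constant on a bounded input domain would enforce.
t = np.linspace(-0.1, 0.1, 201)

# Per-sample "real" terms of the three losses compared in the paper.
ns_gan = -np.log(1.0 / (1.0 + np.exp(-t)))  # NS-GAN: -log(sigmoid(D(x)))
ls_gan = (t - 1.0) ** 2                      # LS-GAN: (D(x) - 1)^2
w_gan = -t                                   # WGAN:   -D(x), already linear

# First-order Taylor expansions around t = 0: both reduce to an affine
# function of t, i.e. the WGAN loss up to scale and offset.
ns_lin = np.log(2.0) - 0.5 * t  # -log(sigmoid(t)) ~ log 2 - t/2
ls_lin = 1.0 - 2.0 * t          # (t - 1)^2       ~ 1 - 2t

print("max NS-GAN linearization error:", np.abs(ns_gan - ns_lin).max())  # ~1.2e-3
print("max LS-GAN linearization error:", np.abs(ls_gan - ls_lin).max())  # = 1e-2
```

The linearization error shrinks quadratically with the output range, which is consistent with the paper's observation that any smooth loss, even sin or exp, behaves like a linear one in this regime.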


Related research

11/04/2021 · GraN-GAN: Piecewise Gradient Normalization for Generative Adversarial Networks
Modern generative adversarial networks (GANs) predominantly use piecewis...

01/23/2017 · Loss-Sensitive Generative Adversarial Networks on Lipschitz Densities
*New Theory Result* We analyze the generalizability of the LS-GAN, showi...

12/24/2018 · Improving MMD-GAN Training with Repulsive Loss Function
Generative adversarial nets (GANs) are widely used to learn the data sam...

04/06/2021 · Generalization of GANs under Lipschitz continuity and data augmentation
Generative adversarial networks (GANs) have been widely used in va...

07/12/2021 · Prb-GAN: A Probabilistic Framework for GAN Modelling
Generative adversarial networks (GANs) are widely used to generate real...

03/29/2023 · Lipschitzness Effect of a Loss Function on Generalization Performance of Deep Neural Networks Trained by Adam and AdamW Optimizers
The generalization performance of deep neural networks with regard to th...

03/20/2019 · Box-constrained monotone L_∞-approximations and Lipschitz-continuous regularized functions
Let f:[0,1]→[0,1] be a nondecreasing function. The main goal of this wor...
