Stackelberg GAN: Towards Provable Minimax Equilibrium via Multi-Generator Architectures

11/19/2018
by   Hongyang Zhang, et al.

We study the problem of alleviating the instability of the GAN training procedure via new architecture design. The discrepancy between the minimax and maximin objective values serves as a proxy for the difficulties that alternating gradient descent encounters in the optimization of GANs. In this work, we give new results on the benefits of multi-generator GAN architectures. We show that the minimax gap shrinks to ϵ as the number of generators grows at rate O(1/ϵ), improving over the best previously known rate of O(1/ϵ^2). At the core of our techniques is a novel application of the Shapley-Folkman lemma to the generic minimax problem; in the prior literature, the technique was only known to work when the objective function is restricted to the Lagrangian of a constrained optimization problem. Our proposed Stackelberg GAN performs well experimentally on both synthetic and real-world datasets, improving Fréchet Inception Distance by 14.61% over previous multi-generator GANs on the benchmark datasets.
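The minimax/maximin discrepancy the abstract uses as a proxy can be made concrete with a toy two-player game (this is an illustrative sketch, not the paper's construction): for a matching-pennies payoff matrix, the pure-strategy minimax and maximin values differ, and allowing a mixture over the minimizing player's strategies, loosely analogous to a mixture of generators, closes the gap.

```python
import numpy as np

# Toy matching-pennies payoff; the row player minimizes, the column player maximizes.
A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])

# Pure-strategy values: the order of min and max matters when the gap is positive.
minimax = A.max(axis=1).min()   # min_i max_j A[i, j]  -> 1.0
maximin = A.min(axis=0).max()   # max_j min_i A[i, j]  -> -1.0
gap = minimax - maximin         # positive duality gap for pure strategies -> 2.0

# A uniform mixture over rows (analogous to mixing several generators)
# yields payoff 0 against every column, so the gap vanishes.
p = np.array([0.5, 0.5])
mixed_value = (p @ A).max()     # -> 0.0

print(gap, mixed_value)
```

The analogy is loose: in the paper the gap is driven down by the Shapley-Folkman lemma as the number of generators grows, whereas here a single mixed strategy already achieves the game value.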

