
Improving sample diversity of a pre-trained, class-conditional GAN by changing its class embeddings

by Qi Li, et al.

Mode collapse is a well-known issue with Generative Adversarial Networks (GANs) and is a byproduct of unstable GAN training. We propose to improve the sample diversity of a pre-trained class-conditional generator by modifying its class embeddings in the direction of maximizing the log probability outputs of a classifier pre-trained on the same dataset. We improved the sample diversity of state-of-the-art ImageNet BigGANs at both 128x128 and 256x256 resolutions. By replacing the embeddings, we can also synthesize plausible images for Places365 using a BigGAN pre-trained on ImageNet.
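The core idea above, ascending the gradient of a pre-trained classifier's target-class log-probability with respect to the generator's class embedding while both networks stay frozen, can be sketched as follows. This is an illustrative toy, not the paper's BigGAN-AM code: the linear `W_g` "generator" and `W_c` "classifier" are stand-ins for the real networks, and the gradients are derived by hand for this linear case.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def update_class_embedding(W_g, W_c, emb, target,
                           steps=100, lr=0.1, batch=8, z_dim=4, seed=0):
    """Gradient ascent on mean log p(target | G(z, emb)).

    Toy linear stand-ins (not the BigGAN API):
      "generator":  img    = [z, emb] @ W_g   (frozen)
      "classifier": logits = img @ W_c        (frozen)
    Only the class embedding `emb` is updated.
    """
    rng = np.random.default_rng(seed)
    emb = emb.copy()
    for _ in range(steps):
        z = rng.standard_normal((batch, z_dim))
        inp = np.concatenate([z, np.tile(emb, (batch, 1))], axis=1)
        logits = inp @ W_g @ W_c
        p = softmax(logits)
        # d(mean log p_target)/d(logits) = one-hot(target) - p
        g_logits = -p
        g_logits[:, target] += 1.0
        # backpropagate by hand through the two linear maps
        g_inp = g_logits @ W_c.T @ W_g.T
        g_emb = g_inp[:, z_dim:].mean(axis=0)  # embedding slice of the input
        emb += lr * g_emb                      # ascent step
    return emb
```

In the paper's setting the same loop would run with the frozen BigGAN generator and an ImageNet classifier in place of the linear maps, with gradients supplied by autodiff rather than derived manually.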





Code Repositories


BigGAN-AM improves the sample diversity of BigGAN and synthesizes Places365 images.
