Stable Parallel Training of Wasserstein Conditional Generative Adversarial Neural Networks

by Massimiliano Lupo Pasini, et al.
Oak Ridge National Laboratory

We propose a stable, parallel approach to training Wasserstein Conditional Generative Adversarial Neural Networks (W-CGANs) under a fixed computational budget. Unlike previous distributed GAN training techniques, our approach avoids inter-process communication, reduces the risk of mode collapse, and enhances scalability by using multiple generators, each concurrently trained on a single data label. The Wasserstein metric further reduces the risk of cycling by stabilizing the training of each generator. We illustrate the approach on CIFAR10, CIFAR100, and ImageNet1k, three standard benchmark image datasets, maintaining the original resolution of the images in each dataset. Performance is assessed in terms of scalability and final accuracy within a fixed computational time and fixed computational resources. Accuracy is measured with the inception score, the Fréchet inception distance, and image quality. Compared with previous results obtained by applying the parallel approach to deep convolutional conditional generative adversarial neural networks (DC-CGANs), we show improvements in inception score and Fréchet inception distance, as well as improved quality of the generated images. Weak scaling is attained on all three datasets using up to 2,000 NVIDIA V100 GPUs on the OLCF supercomputer Summit.
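The communication-free decomposition described above can be sketched in a few lines: each worker is assigned exactly one class label and trains its own generator on only that label's samples, so no gradients or parameters ever cross process boundaries. This is a hypothetical illustration of the partitioning idea only (function names and the list-based dataset are assumptions, not the authors' code); the actual training of each per-label W-CGAN is omitted.

```python
def assign_labels_to_ranks(labels, n_ranks):
    """Map each distinct class label to one worker rank (one generator per label)."""
    distinct = sorted(set(labels))
    if n_ranks < len(distinct):
        raise ValueError("need at least one rank per label")
    return {label: rank for rank, label in enumerate(distinct)}

def local_shard(dataset, label_to_rank, my_rank):
    """Return the samples this rank trains on: only those with its assigned label.

    Because every rank filters its shard independently, no inter-process
    communication is needed during training.
    """
    return [(x, y) for x, y in dataset if label_to_rank[y] == my_rank]

# Toy usage: 3 labels spread across 3 ranks.
labels = [0, 1, 2, 0, 1]
mapping = assign_labels_to_ranks(labels, n_ranks=3)
dataset = [("img_a", 0), ("img_b", 1), ("img_c", 2), ("img_d", 0)]
shard0 = local_shard(dataset, mapping, my_rank=0)  # only label-0 samples
```

In a real run each rank would then fit a separate conditional generator (and critic) on its shard, which is what lets the scheme scale to thousands of GPUs without synchronization.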


