Score-based Conditional Generation with Fewer Labeled Data by Self-calibrating Classifier Guidance

by Paul Kuo-Ming Huang et al.

Score-based Generative Models (SGMs) are a popular family of deep generative models that achieve leading image generation quality. Earlier studies have extended SGMs to class-conditional generation by coupling an unconditional SGM with the guidance of a trained classifier. Nevertheless, such classifier-guided SGMs do not always achieve accurate conditional generation, especially when trained with fewer labeled data. We argue that the issue is rooted in the unreliable gradients of the classifier and its inability to fully utilize unlabeled data during training. We then propose to improve classifier-guided SGMs by letting the classifier calibrate itself. Our key idea is to use principles from energy-based models to convert the classifier into another view of the unconditional SGM. The existing loss for the unconditional SGM can then be adopted to calibrate the classifier using both labeled and unlabeled data. Empirical results validate that the proposed approach significantly improves conditional generation quality across different percentages of labeled data. The improved performance makes the proposed approach consistently superior to other conditional SGMs when using fewer labeled data. The results confirm the potential of the proposed approach for generative modeling with limited labeled data.
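The energy-based reinterpretation described above can be sketched in a few lines: treating a classifier's logits as negative energies, the log-sum-exp over classes gives an unnormalized log p(x), whose input-gradient is an unconditional score that can be calibrated with the SGM's loss, while the log-softmax gradient supplies the usual classifier-guidance term. The toy model and function names below are illustrative assumptions, not the paper's actual architecture or API:

```python
import torch

# Hypothetical noise-conditional classifier mapping (x, t) -> class logits.
# Dimensions and the name ToyClassifier are illustrative only.
class ToyClassifier(torch.nn.Module):
    def __init__(self, dim=8, n_classes=10):
        super().__init__()
        self.net = torch.nn.Linear(dim + 1, n_classes)

    def forward(self, x, t):
        t_feat = t.expand(x.shape[0], 1)  # broadcast noise level to the batch
        return self.net(torch.cat([x, t_feat], dim=1))

def scores_from_classifier(clf, x, t, y):
    """Energy-based view of a classifier: with logits f(x)[y] as negative
    energies, log p(x) = logsumexp_y f(x)[y] - log Z.  Its input-gradient
    estimates the unconditional score grad_x log p(x); the log-softmax
    gradient gives the guidance term grad_x log p(y|x)."""
    x = x.detach().requires_grad_(True)
    logits = clf(x, t)
    log_px = torch.logsumexp(logits, dim=1).sum()      # unnormalized log p(x)
    idx = torch.arange(len(y))
    log_py_x = logits.log_softmax(dim=1)[idx, y].sum()  # log p(y|x)
    uncond_score = torch.autograd.grad(log_px, x, retain_graph=True)[0]
    guidance = torch.autograd.grad(log_py_x, x)[0]
    # Conditional score for guided sampling: grad log p(x) + grad log p(y|x)
    return uncond_score, uncond_score + guidance

clf = ToyClassifier()
x = torch.randn(4, 8)
t = torch.tensor([[0.5]])
y = torch.tensor([0, 1, 2, 3])
s_uncond, s_cond = scores_from_classifier(clf, x, t, y)
```

Under this view, `s_uncond` can be matched against the unconditional SGM's score target on both labeled and unlabeled samples (the self-calibration idea), while `s_cond` is what a guided sampler would follow.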
