Improving Differentially Private Models with Active Learning

by Zhengli Zhao, et al.

Broad adoption of machine learning techniques has increased privacy concerns for models trained on sensitive data such as medical records. Existing techniques for training differentially private (DP) models give rigorous privacy guarantees, but applying these techniques to neural networks can severely degrade model performance. This performance reduction is an obstacle to deploying private models in the real world. In this work, we improve the performance of DP models by fine-tuning them through active learning on public data. We introduce two new techniques, DIVERSEPUBLIC and NEARPRIVATE, for doing this fine-tuning in a privacy-aware way. On the MNIST and SVHN datasets, these techniques improve state-of-the-art accuracy for DP models while retaining privacy guarantees.
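As a rough illustration of the active-learning step described above, the sketch below selects the public examples on which a trained DP model is most uncertain (by predictive entropy) as fine-tuning candidates. This is a generic uncertainty-sampling heuristic, not the paper's exact DIVERSEPUBLIC or NEARPRIVATE procedure; the function name and the toy softmax outputs are hypothetical.

```python
import numpy as np

def select_uncertain_public_examples(probs, k):
    """Rank unlabeled public examples by predictive entropy and
    return the indices of the k most uncertain ones.

    probs: (n_examples, n_classes) array of the DP model's softmax
           outputs on the public pool.
    k:     number of examples to select for fine-tuning.
    """
    eps = 1e-12  # avoid log(0) for confident predictions
    entropy = -np.sum(probs * np.log(probs + eps), axis=1)
    # Highest-entropy (most uncertain) examples first.
    return np.argsort(entropy)[::-1][:k]

# Toy public pool: 4 examples, 3 classes.
pool_probs = np.array([
    [0.98, 0.01, 0.01],  # confident prediction
    [0.34, 0.33, 0.33],  # nearly uniform -> most uncertain
    [0.70, 0.20, 0.10],
    [0.50, 0.45, 0.05],
])
print(select_uncertain_public_examples(pool_probs, 2))  # -> [1 3]
```

Because the selection runs only on public data and queries the model through its (already privatized) outputs, a loop like this can be combined with a DP-trained model without spending additional privacy budget on the private training set.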


