Self-Supervised Pretraining for Differentially Private Learning

06/14/2022
by Arash Asadian, et al.

We demonstrate that self-supervised pretraining (SSP) is a scalable solution for deep learning with differential privacy (DP) in image classification, regardless of the size of the available public datasets. When no public dataset is available, we show that features generated by SSP on a single image enable a private classifier to obtain much better utility than non-learned handcrafted features under the same privacy budget. When a moderate or large public dataset is available, the features produced by SSP greatly outperform features trained with labels on various complex private datasets under the same privacy budget. We also compare multiple DP-enabled training frameworks for training a private classifier on the features generated by SSP. Finally, we report a non-trivial utility of 25.3% on a private ImageNet-1K dataset when ϵ=3.
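The private-classifier step described above can be sketched as training a linear head on frozen SSP features with DP-SGD (per-example gradient clipping plus Gaussian noise). The sketch below is a minimal illustration, not the paper's implementation: the function name, hyperparameters, and pure-NumPy softmax classifier are all assumptions made for clarity.

```python
import numpy as np

def dp_sgd_linear(features, labels, epochs=5, lr=0.5,
                  clip_norm=1.0, noise_mult=1.0, batch=64, seed=0):
    """Illustrative DP-SGD for a softmax linear head on frozen features.

    Per-example gradients are clipped to `clip_norm`, summed, perturbed
    with Gaussian noise of std `noise_mult * clip_norm`, and averaged.
    (Hypothetical sketch; the paper's actual training setup may differ.)
    """
    rng = np.random.default_rng(seed)
    n, d = features.shape
    k = int(labels.max()) + 1
    W = np.zeros((d, k))
    for _ in range(epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch):
            b = idx[start:start + batch]
            X, y = features[b], labels[b]
            # softmax probabilities (numerically stabilized)
            z = X @ W
            z -= z.max(axis=1, keepdims=True)
            p = np.exp(z)
            p /= p.sum(axis=1, keepdims=True)
            p[np.arange(len(b)), y] -= 1.0   # dL/dz for cross-entropy
            # per-example gradients, shape (B, d, k), clipped individually
            grads = X[:, :, None] * p[:, None, :]
            norms = np.sqrt((grads ** 2).sum(axis=(1, 2)))
            grads /= np.maximum(1.0, norms / clip_norm)[:, None, None]
            g = grads.sum(axis=0)
            # Gaussian noise calibrated to the clipping norm
            g += rng.normal(0.0, noise_mult * clip_norm, size=g.shape)
            W -= lr * g / len(b)
    return W
```

In this setup only the small linear head is trained privately; the SSP feature extractor is fixed, which is what keeps the approach scalable under a tight privacy budget.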

