Privacy Amplification via Compression: Achieving the Optimal Privacy-Accuracy-Communication Trade-off in Distributed Mean Estimation

04/04/2023
by   Wei-Ning Chen, et al.

Privacy and communication constraints are two major bottlenecks in federated learning (FL) and analytics (FA). We study the optimal accuracy of mean and frequency estimation (canonical models for FL and FA, respectively) under joint communication and (ε, δ)-differential privacy (DP) constraints. We show that in order to achieve the optimal error under (ε, δ)-DP, it suffices for each client to send Θ(n min(ε, ε^2)) bits for FL and Θ(log(n min(ε, ε^2))) bits for FA to the server, where n is the number of participating clients. Without compression, each client needs O(d) bits and log d bits for the mean and frequency estimation problems, respectively (where d is the number of trainable parameters in FL or the domain size in FA). Hence we obtain significant savings in the regime n min(ε, ε^2) = o(d), which is often the relevant regime in practice. Our algorithms leverage compression for privacy amplification: when each client communicates only partial information about its sample, we show that privacy can be amplified by randomly selecting the part contributed by each client.
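To make the subsampling idea concrete, below is a minimal Python sketch of "communicate only a random part of each sample" for distributed mean estimation. The function names (client_message, estimate_mean) and the parameters k and sigma are illustrative assumptions, and the plain Gaussian perturbation stands in for the paper's actual (ε, δ)-DP mechanism; it is not the authors' scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def client_message(x, k, sigma):
    """Reveal only k << d randomly chosen coordinates of x, each perturbed
    with Gaussian noise (a stand-in for a calibrated DP mechanism)."""
    d = x.shape[0]
    idx = rng.choice(d, size=k, replace=False)       # random part of the sample
    vals = x[idx] + rng.normal(0.0, sigma, size=k)   # local perturbation
    return idx, vals

def estimate_mean(messages, d, n, k):
    """Unbiased server-side mean estimate: each coordinate is reported with
    probability k/d, so rescale the aggregate by d/k."""
    acc = np.zeros(d)
    for idx, vals in messages:
        acc[idx] += vals
    return (d / k) * acc / n

# Toy run: n clients, d-dimensional data, k coordinates sent per client.
n, d, k, sigma = 1000, 256, 16, 0.1
data = rng.normal(size=(n, d))
msgs = [client_message(x, k, sigma) for x in data]
est = estimate_mean(msgs, d, n, k)
print(np.linalg.norm(est - data.mean(axis=0)))
```

In this sketch, each client's upload shrinks from d values to k indices and values (roughly k log d bits plus payload), while the random choice of which coordinates to reveal is the source of the amplification effect the abstract describes.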
