The Interpolated MVU Mechanism For Communication-efficient Private Federated Learning

11/08/2022
by   Chuan Guo, et al.

We consider private federated learning (FL), where a server aggregates differentially private gradient updates from a large number of clients in order to train a machine learning model. The main challenge is balancing privacy with both the classification accuracy of the learned model and the amount of communication between the clients and the server. In this work, we build on a recently proposed method for communication-efficient private FL – the MVU mechanism – by introducing a new interpolation procedure that admits a more efficient privacy analysis. The result is the new Interpolated MVU mechanism, which achieves state-of-the-art results for communication-efficient private FL on a variety of datasets.
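To make the setting concrete, the following is a minimal sketch of the general private-aggregation step described above: each client clips its gradient to a bounded L2 norm, the server sums the clipped updates and adds Gaussian noise before averaging. This illustrates the baseline Gaussian mechanism only, not the MVU or Interpolated MVU mechanism itself; the function names and parameters (`clip_norm`, `noise_mult`) are illustrative assumptions, not the paper's API.

```python
import numpy as np

def clip_l2(grad, clip_norm):
    """Scale grad so its L2 norm is at most clip_norm (standard DP clipping)."""
    norm = np.linalg.norm(grad)
    if norm > clip_norm:
        return grad * (clip_norm / norm)
    return grad

def private_aggregate(client_grads, clip_norm=1.0, noise_mult=1.0, rng=None):
    """Sum clipped client gradients, add Gaussian noise scaled to the
    clipping bound (the L2 sensitivity of the sum), and average.

    noise_mult = 0 recovers the exact average of the clipped gradients."""
    rng = np.random.default_rng() if rng is None else rng
    clipped = [clip_l2(g, clip_norm) for g in client_grads]
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_mult * clip_norm, size=total.shape)
    return (total + noise) / len(client_grads)
```

Mechanisms like MVU replace the real-valued noisy update with a quantized, low-bit message per coordinate, which is what makes the scheme communication-efficient; the sketch above sends full-precision floats.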
