DoCoFL: Downlink Compression for Cross-Device Federated Learning

02/01/2023
by Ron Dorfman et al.

Many compression techniques have been proposed to reduce the communication overhead of Federated Learning training procedures. However, these are typically designed for compressing model updates, whose magnitude is expected to decay throughout training. As a result, such methods are inapplicable to downlink (i.e., from the parameter server to clients) compression in the cross-device setting, where heterogeneous clients may appear only once during training and thus must download the full model parameters. In this paper, we propose DoCoFL, a new framework for downlink compression in the cross-device federated learning setting. Importantly, DoCoFL can be seamlessly combined with many uplink compression schemes, rendering it suitable for bi-directional compression. Through extensive evaluation, we demonstrate that DoCoFL offers significant bi-directional bandwidth reduction while achieving accuracy competitive with that of training without any compression.
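To make the setting concrete, here is a minimal sketch of bi-directional compression in cross-device federated learning. It is not DoCoFL's actual algorithm (the abstract does not detail it); the uniform quantizer and the local-update rule below are hypothetical stand-ins. The key point it illustrates is that the server must compress the full model parameters for the downlink, since a newly arriving client has no prior state, while clients compress their updates for the uplink.

```python
# Toy bi-directional compression loop (illustrative sketch only, not DoCoFL).
import numpy as np

def quantize(x: np.ndarray, bits: int) -> np.ndarray:
    """Uniform scalar quantization: a generic stand-in compressor."""
    lo, hi = x.min(), x.max()
    levels = 2 ** bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    q = np.round((x - lo) / scale)
    return lo + q * scale  # values the receiver reconstructs after transfer

rng = np.random.default_rng(0)
dim, num_rounds, clients_per_round = 1000, 5, 4
params = rng.normal(size=dim)  # server-side model parameters

for rnd in range(num_rounds):
    # Downlink: each sampled client may be participating for the first
    # time, so it must receive the full (compressed) parameters rather
    # than a decaying model update.
    downlink_params = quantize(params, bits=4)

    updates = []
    for _ in range(clients_per_round):
        # Hypothetical local step: a noisy pseudo-gradient computed on
        # the decompressed parameters.
        local_update = -0.01 * downlink_params + 0.001 * rng.normal(size=dim)
        # Uplink: clients compress their updates before sending.
        updates.append(quantize(local_update, bits=2))

    # Server aggregates the decompressed client updates.
    params = params + np.mean(updates, axis=0)
    print(f"round {rnd}: ||params|| = {np.linalg.norm(params):.4f}")
```

Because parameters (unlike updates) do not shrink over training, the downlink compressor cannot rely on decaying magnitudes, which is why update-oriented schemes fail in this direction; a separate uplink scheme can still be plugged in unchanged, as the sketch's two quantize calls suggest.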
