Friends to Help: Saving Federated Learning from Client Dropout

05/26/2022
by   Heqiang Wang, et al.

Federated learning (FL) is a prominent distributed machine learning framework owing to its benefits in data privacy and communication efficiency. Because full client participation is often infeasible under constrained resources, partial participation FL algorithms have been investigated that proactively select/sample a subset of clients, aiming to achieve learning performance close to the full participation case. This paper studies a much less well understood passive partial client participation scenario, where partial participation results from external events, namely client dropout, rather than from a decision of the FL algorithm. We cast FL with client dropout as a special case of a larger class of FL problems in which clients can submit substitute (possibly inaccurate) local model updates. Based on our convergence analysis, we develop a new algorithm, FL-FDMS, that discovers friends of clients (i.e., clients whose data distributions are similar) on the fly and uses friends' local updates as substitutes for the dropout clients, thereby reducing the substitution error and improving convergence. A complexity reduction mechanism is also incorporated into FL-FDMS, making it both theoretically sound and practically useful. Experiments on MNIST and CIFAR-10 confirm the superior performance of FL-FDMS in handling client dropout in FL.
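The paper's own pseudocode is not reproduced on this page. As a rough sketch of the friend-based substitution idea described in the abstract, the server-side aggregation step might look like the following (all names are hypothetical; plain unweighted FedAvg is assumed, and the friend-similarity scores, which FL-FDMS estimates on the fly across rounds, are taken as given here):

```python
import numpy as np

def fedavg_with_friend_substitution(updates, dropped, similarity):
    """One aggregation round with friend substitution (illustrative sketch).

    updates    : dict client_id -> local model update (None if dropped)
    dropped    : set of client ids that dropped out this round
    similarity : dict mapping each dropped client to {other_id: score};
                 FL-FDMS estimates these on the fly, but they are
                 assumed precomputed in this sketch.
    """
    active = [i for i in updates if i not in dropped]
    filled = {}
    for i in updates:
        if i in dropped:
            # Substitute the dropout's update with that of its most
            # similar active client (its best available "friend").
            friend = max(active, key=lambda j: similarity[i][j])
            filled[i] = updates[friend]
        else:
            filled[i] = updates[i]
    # Plain (unweighted) FedAvg over the filled-in updates.
    return np.mean([filled[i] for i in sorted(filled)], axis=0)

# Toy round: client 1 drops out; its closest friend is client 0,
# so client 0's update stands in for it before averaging.
updates = {0: np.array([1.0, 0.0]), 1: None,
           2: np.array([0.0, 1.0]), 3: np.array([1.0, 0.1])}
sim = {1: {0: 0.9, 2: 0.1, 3: 0.8}}
agg = fedavg_with_friend_substitution(updates, {1}, sim)
```

In the actual algorithm the similarity scores are learned across rounds from observed model updates, and a complexity reduction mechanism limits the friend search; this sketch only illustrates the substitution step itself.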


Related research

- 10/06/2022 · DReS-FL: Dropout-Resilient Secure Federated Learning for Non-IID Clients via Secret Data Sharing
- 01/26/2022 · Fast Server Learning Rate Tuning for Coded Federated Dropout
- 05/26/2022 · FedAug: Reducing the Local Learning Bias Improves Federated Learning on Heterogeneous Data
- 08/31/2023 · FedDD: Toward Communication-efficient Federated Learning with Differential Parameter Dropout
- 10/21/2021 · Guess what? You can boost Federated Learning for free
- 07/14/2023 · FedBIAD: Communication-Efficient and Accuracy-Guaranteed Federated Learning with Bayesian Inference-Based Adaptive Dropout
- 07/05/2023 · FLuID: Mitigating Stragglers in Federated Learning using Invariant Dropout
