Federated Learning in the Presence of Adversarial Client Unavailability

05/31/2023
by Lili Su, et al.

Federated learning is a decentralized machine learning framework in which not all clients are able to participate in every round. An emerging line of research is devoted to tackling arbitrary client unavailability. Existing theoretical analyses impose restrictive structural assumptions on the unavailability patterns, and the proposed algorithms are tailored to those assumptions. In this paper, we relax those assumptions and consider adversarial client unavailability. To quantify the degree of client unavailability, we use the notion of an ϵ-adversary dropout fraction. For both non-convex and strongly convex global objectives, we show that simple variants of FedAvg or FedProx, albeit completely agnostic to ϵ, converge to an estimation error on the order of ϵ(G^2 + σ^2), where G is a heterogeneity parameter and σ^2 is the noise level. We prove that this estimation error is minimax-optimal. We also show that the variants of FedAvg or FedProx achieve convergence rates of O(1/√T) for non-convex objectives and O(1/T) for strongly convex objectives, both of which are the best possible for any first-order method that only has access to noisy gradients. Our proofs build on a tight analysis of the selection bias that persists throughout the learning process. We validate our theoretical predictions through numerical experiments on synthetic and real-world datasets.
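To make the setting concrete, below is a minimal sketch of one FedAvg-style round under client unavailability. It is not the authors' code: the function names (local_update, fedavg_round), the least-squares local objective, and the parameter epsilon are illustrative assumptions, and for simplicity the dropout pattern is drawn at random here, whereas the paper allows it to be chosen adversarially. The key point it illustrates is that the server remains agnostic to ϵ and simply averages whatever client updates arrive.

```python
import numpy as np

def local_update(weights, client_data, lr=0.1, steps=5):
    # A few local gradient steps on a least-squares loss (a stand-in for the
    # client's local objective in a FedAvg/FedProx variant).
    w = weights.copy()
    x, y = client_data
    for _ in range(steps):
        grad = x.T @ (x @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg_round(weights, clients, dropped):
    # Server-side aggregation: clients flagged as unavailable never respond,
    # and the server averages whatever updates it receives (agnostic to epsilon).
    received = [local_update(weights, data)
                for data, gone in zip(clients, dropped) if not gone]
    if not received:
        return weights  # every sampled client was suppressed this round
    return np.mean(received, axis=0)

# Toy run: 10 heterogeneous linear-regression clients; up to an epsilon
# fraction of responses may be suppressed in each round. The dropout is
# random here purely for illustration (hypothetical setup, not the paper's).
rng = np.random.default_rng(0)
d, n, num_clients, epsilon = 5, 50, 10, 0.2
clients = []
for k in range(num_clients):
    x = rng.normal(size=(n, d))
    w_k = rng.normal(size=d) + 0.1 * k  # cross-client heterogeneity (the "G" term)
    clients.append((x, x @ w_k + 0.1 * rng.normal(size=n)))  # label noise (the "sigma" term)

w = np.zeros(d)
for t in range(200):
    dropped = rng.random(num_clients) < epsilon
    w = fedavg_round(w, clients, dropped)
```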

Related research

12/07/2020 · Improved Convergence Rates for Non-Convex Federated Learning with Compression
Federated learning is a new distributed learning paradigm that enables e...

03/28/2022 · FedADMM: A Federated Primal-Dual Algorithm Allowing Partial Participation
Federated learning is a framework for distributed optimization that plac...

02/25/2021 · Achieving Linear Convergence in Federated Learning under Objective and Systems Heterogeneity
We consider a standard federated learning architecture where a group of ...

12/20/2020 · Toward Understanding the Influence of Individual Clients in Federated Learning
Federated learning allows mobile clients to jointly train a global model...

01/26/2022 · Server-Side Stepsizes and Sampling Without Replacement Provably Help in Federated Optimization
We present a theoretical study of server-side optimization in federated ...

03/18/2023 · Byzantine-Resilient Federated Learning at Edge
Both Byzantine resilience and communication efficiency have attracted tr...

05/25/2023 · Incentivizing Honesty among Competitors in Collaborative Learning and Optimization
Collaborative learning techniques have the potential to enable training ...
