Communication-Efficient Federated Learning with Acceleration of Global Momentum

01/10/2022
by   Geeho Kim, et al.
Federated learning often suffers from unstable and slow convergence due to the heterogeneous characteristics of participating clients. This tendency is aggravated when the client participation ratio is low, since the information collected from the clients at each round is prone to be more inconsistent. To tackle this challenge, we propose a novel federated learning framework that improves the stability of the server-side aggregation step by sending the clients an accelerated model, estimated with the global gradient, to guide the local gradient updates. Our algorithm naturally aggregates and conveys the global update information to participants with no additional communication cost and does not require clients to store past models. We also regularize the local updates to further reduce their bias and improve their stability. We perform comprehensive empirical studies on real data under various settings and demonstrate the remarkable performance of the proposed method in terms of accuracy and communication efficiency compared to state-of-the-art methods, especially with low client participation rates. Our code is available at https://github.com/ninigapa0/FedAGM
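The abstract's core idea, sending clients a momentum-accelerated lookahead model (at no extra communication cost, since only one model is transmitted) and regularizing local updates toward it, can be illustrated with a minimal simulation. This is a hedged sketch, not the authors' exact FedAGM algorithm: the toy linear-regression clients, the hyperparameters (`lam`, `mu`, `server_lr`), and the helper names are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy heterogeneous clients: each holds linear-regression data with a
# different optimum, mimicking non-IID federated data.
def make_client(shift, n=32, d=5):
    X = rng.normal(size=(n, d))
    y = X @ (np.ones(d) * shift) + 0.01 * rng.normal(size=n)
    return X, y

clients = [make_client(s) for s in np.linspace(-1.0, 1.0, 10)]

def local_update(w_init, X, y, lr=0.05, steps=10, mu=0.1):
    # Local SGD with a proximal term pulling toward the received
    # (accelerated) model, reducing client drift.
    w = w_init.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y) + mu * (w - w_init)
        w -= lr * grad
    return w

def run(rounds=50, frac=0.3, lam=0.9, server_lr=1.0, d=5):
    w = np.zeros(d)  # global model
    m = np.zeros(d)  # server-side global momentum
    n_sel = max(1, int(frac * len(clients)))
    for _ in range(rounds):
        # Transmit a lookahead model accelerated with global momentum;
        # the payload is the same size as the plain global model.
        w_send = w + lam * m
        idx = rng.choice(len(clients), size=n_sel, replace=False)
        deltas = [local_update(w_send, *clients[i]) - w_send for i in idx]
        # Server aggregation: fold the averaged client delta into the
        # global momentum, then apply it to the global model.
        m = lam * m + server_lr * np.mean(deltas, axis=0)
        w = w + m
    return w

w_final = run()
print(np.round(w_final, 2))
```

With the client optima placed symmetrically around the origin, the global model converges near the zero vector despite only 30% of clients participating per round; the momentum carries information from previously sampled clients across rounds, which is what stabilizes aggregation under partial participation.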


Related research

11/17/2022
Improving Federated Learning Communication Efficiency with Global Momentum Fusion for Gradient Compression Schemes
Communication costs within Federated learning hinder the system scalabil...

04/12/2023
FedTrip: A Resource-Efficient Federated Learning Method with Triplet Regularization
In the federated learning scenario, geographically distributed clients c...

07/14/2022
Multi-Level Branched Regularization for Federated Learning
A critical challenge of federated learning is data heterogeneity and imb...

06/13/2022
Accelerating Federated Learning via Sampling Anchor Clients with Large Batches
Using large batches in recent federated learning studies has improved co...

12/15/2021
LoSAC: An Efficient Local Stochastic Average Control Method for Federated Optimization
Federated optimization (FedOpt), which targets at collaboratively traini...

11/18/2021
A Novel Optimized Asynchronous Federated Learning Framework
Federated Learning (FL) since proposed has been applied in many fields, ...

03/28/2023
Communication-Efficient Vertical Federated Learning with Limited Overlapping Samples
Federated learning is a popular collaborative learning approach that ena...
