FedTrip: A Resource-Efficient Federated Learning Method with Triplet Regularization

04/12/2023
by   Xujing Li, et al.

In federated learning, geographically distributed clients collaboratively train a global model. Data heterogeneity among clients leads to inconsistent model updates, which significantly slows convergence. To alleviate this issue, many methods employ regularization terms that narrow the discrepancy between client-side local models and the server-side global model. However, these methods limit the exploration of superior local models and ignore the valuable information in historical models. Although a recent representation-based method considers both the global model and historical local models, it suffers from prohibitive computation cost. To accelerate convergence with low resource consumption, we propose FedTrip, a model regularization method designed to restrict global-local divergence and decrease current-historical correlation, alleviating the negative effects of data heterogeneity. FedTrip pulls the current local model toward the global model while pushing it away from historical local models, which keeps local updates consistent across clients and efficiently explores superior local models at negligible additional computation cost. Empirically, we demonstrate the superiority of FedTrip via extensive evaluations: to reach a target accuracy, FedTrip significantly reduces the total overhead of client-server communication and local computation compared with state-of-the-art baselines.
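The pull-toward-global, push-from-history idea described above can be sketched as a simple penalty added to each client's local loss. This is a minimal illustrative sketch, not the paper's exact formulation: the function name, the quadratic distance terms, and the coefficients `mu` and `lam` are all assumptions introduced here.

```python
import numpy as np

def fedtrip_regularizer(local, global_model, historical, mu=0.1, lam=0.01):
    """Hypothetical triplet-style regularizer (illustrative, not the paper's
    exact form): penalize distance to the global model while rewarding
    distance from a historical local model.

    local, global_model, historical: flattened parameter vectors.
    mu: weight pulling the local model toward the global model.
    lam: weight pushing the local model away from its historical snapshot.
    """
    pull = np.sum((local - global_model) ** 2)  # keep local updates consistent
    push = np.sum((local - historical) ** 2)    # escape stale local optima
    return mu * pull - lam * push

# Example: a client would minimize task_loss + fedtrip_regularizer(...)
local = np.array([1.0, 1.0])
glob = np.zeros(2)
hist = np.array([2.0, 2.0])
penalty = fedtrip_regularizer(local, glob, hist)  # 0.1*2 - 0.01*2 = 0.18
```

Because the penalty only requires distances to two stored parameter vectors, it adds a few vector operations per step rather than the extra forward passes that representation-based methods incur.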

