Federated Learning on Non-iid Data via Local and Global Distillation

06/26/2023
by Xiaolin Zheng, et al.

Most existing federated learning algorithms are based on the vanilla FedAvg scheme. However, as data complexity and the number of model parameters grow, the communication traffic and the number of training rounds required by such algorithms increase significantly, and they perform especially poorly in non-independently and identically distributed (non-IID) scenarios. In this work, we propose FedND: federated learning with noise distillation, which uses knowledge distillation to optimize the model training process. On the client side, we propose a self-distillation method to train the local model. On the server side, we generate noisy samples for each client and use them to distill the other clients' models. Finally, the global model is obtained by aggregating the local models. Experimental results show that FedND achieves the best performance and is more communication-efficient than state-of-the-art methods.
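The abstract only outlines the two distillation steps, so the sketch below is a hypothetical PyTorch rendering of that outline rather than the authors' implementation: the self-distillation weight `alpha`, the temperature, the plain-Gaussian noise generator, and the per-round update schedule are all assumptions made for illustration.

```python
# Hypothetical sketch of FedND's client- and server-side distillation (not the authors' code).
import copy
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    """Standard temperature-scaled KL knowledge-distillation loss."""
    return F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2

def client_update(model, loader, epochs=1, lr=0.01, alpha=0.5):
    """Client side: local training with self-distillation, where a frozen
    snapshot of the incoming model acts as the teacher for its own update."""
    teacher = copy.deepcopy(model).eval()
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            logits = model(x)
            with torch.no_grad():
                t_logits = teacher(x)
            loss = F.cross_entropy(logits, y) + alpha * kd_loss(logits, t_logits)
            loss.backward()
            opt.step()
    return model.state_dict()

def server_round(template, client_states, noise_shape, lr=0.01):
    """Server side: each client's noisy samples carry its knowledge to the other
    client models, after which the local models are averaged into the global model."""
    clients = [copy.deepcopy(template) for _ in client_states]
    for m, s in zip(clients, client_states):
        m.load_state_dict(s)

    for i, teacher in enumerate(clients):
        noise = torch.randn(noise_shape)        # noisy samples generated "for" client i
        with torch.no_grad():
            t_logits = teacher(noise)           # teacher predictions on the noise
        for j, student in enumerate(clients):
            if j == i:
                continue
            opt = torch.optim.SGD(student.parameters(), lr=lr)
            opt.zero_grad()
            loss = kd_loss(student(noise), t_logits)
            loss.backward()
            opt.step()

    # FedAvg-style aggregation of the distilled local models
    # (assumes all state entries are floating-point parameters).
    global_model = copy.deepcopy(template)
    avg = copy.deepcopy(clients[0].state_dict())
    for key in avg:
        avg[key] = torch.stack([m.state_dict()[key].float() for m in clients]).mean(0)
    global_model.load_state_dict(avg)
    return global_model
```

In a real deployment the noisy samples would presumably be tailored to each client rather than drawn i.i.d. from a Gaussian; plain Gaussian noise is used here only to keep the sketch self-contained.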

