FedBIAD: Communication-Efficient and Accuracy-Guaranteed Federated Learning with Bayesian Inference-Based Adaptive Dropout

07/14/2023
by   Jingjing Xue, et al.
Federated Learning (FL) is emerging as a distributed machine learning paradigm that avoids end-user data transmission, effectively preventing privacy leakage. Participating devices in FL are usually bandwidth-constrained, and in wireless networks the uplink is much slower than the downlink, which causes a severe uplink communication bottleneck. A prominent direction for alleviating this problem is federated dropout, which drops a fraction of the weights of local models. However, existing federated dropout studies focus on random or ordered dropout and lack theoretical support, so their performance is unguaranteed. In this paper, we propose Federated learning with Bayesian Inference-based Adaptive Dropout (FedBIAD), which regards the weight rows of local models as probability distributions and adaptively drops partial weight rows based on importance indicators correlated with the trend of the local training loss. With FedBIAD, each client adaptively selects a high-quality dropping pattern with accurate approximations and transmits only the parameters of non-dropped weight rows, mitigating uplink costs while improving accuracy. Theoretical analysis demonstrates that the convergence rate of the average generalization error of FedBIAD is minimax optimal up to a squared logarithmic factor. Extensive experiments on image classification and next-word prediction show that, compared with status quo approaches, FedBIAD provides a 2x uplink reduction with an accuracy increase of up to 2.41% even on non-Independent and Identically Distributed (non-IID) data, which brings up to a 72% decrease in training time.
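The core idea described above, keeping only the weight rows that an importance indicator ranks highest and uplinking just those, can be sketched as follows. This is a minimal illustration, not the paper's Bayesian procedure: the function name, the `importance` scores, and the fixed `drop_rate` are all assumptions for the example; in FedBIAD the indicator is derived from the trend of the local training loss and the dropping pattern adapts per client.

```python
import numpy as np

def adaptive_row_dropout(weight, importance, drop_rate=0.5):
    """Illustrative sketch: keep the rows of `weight` with the highest
    importance scores; only these rows (plus their indices) would be
    transmitted on the uplink, cutting traffic by roughly `drop_rate`."""
    n_rows = weight.shape[0]
    n_keep = max(1, int(round(n_rows * (1 - drop_rate))))
    # Indices of the most "important" rows (hypothetical indicator).
    keep_idx = np.argsort(importance)[-n_keep:]
    return keep_idx, weight[keep_idx]

# Toy example: a 4x3 weight matrix with per-row importance scores.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 3))
scores = np.array([0.1, 0.9, 0.4, 0.7])
idx, payload = adaptive_row_dropout(W, scores, drop_rate=0.5)
# `payload` holds 2 of the 4 rows: half the uplink volume.
```

On the server side, the received rows would be scattered back into the corresponding positions of the global model before aggregation, which is why the kept indices must accompany the payload.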

Related research:

- 06/17/2020: Mix2FLD: Downlink Federated Learning After Uplink Federated Distillation With Two-Way Mixup. This letter proposes a novel communication-efficient and privacy-preserv...
- 12/07/2020: Design and Analysis of Uplink and Downlink Communications for Federated Learning. Communication has been known to be one of the primary bottlenecks of fed...
- 05/26/2022: Friends to Help: Saving Federated Learning from Client Dropout. Federated learning (FL) is an outstanding distributed machine learning f...
- 07/14/2023: Improved Convergence Analysis and SNR Control Strategies for Federated Learning in the Presence of Noise. We propose an improved convergence analysis technique that characterizes...
- 06/16/2022: Personalized Federated Learning via Variational Bayesian Inference. Federated learning faces huge challenges from model overfitting due to t...
- 07/20/2023: Communication-Efficient Federated Learning over Capacity-Limited Wireless Networks. In this paper, a communication-efficient federated learning (FL) framewo...
- 07/05/2023: FLuID: Mitigating Stragglers in Federated Learning using Invariant Dropout. Federated Learning (FL) allows machine learning models to train locally ...
