Efficient and Private Federated Learning with Partially Trainable Networks

10/06/2021
by Hakim Sidahmed, et al.

Federated learning enables decentralized training of machine learning models across a large number (millions) of edge mobile devices. It is challenging because mobile devices often have limited communication bandwidth and local computation resources, so improving the efficiency of federated learning is critical for scalability and usability. In this paper, we propose to leverage partially trainable neural networks, which freeze a portion of the model parameters during the entire training process, to reduce communication cost with minimal impact on model performance. Through extensive experiments, we empirically show that Federated learning of Partially Trainable neural networks (FedPT) can yield superior communication-accuracy trade-offs, with up to a 46× reduction in communication cost at a small accuracy cost. Our approach also enables faster training, a smaller memory footprint, and better utility under strong differential privacy guarantees. The proposed FedPT method is particularly relevant for pushing the limits of overparameterization in on-device learning.
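Below is a minimal sketch of the core idea, assuming a PyTorch client; the model architecture, the choice of which layer stays trainable, and the helper names are illustrative assumptions, not the paper's exact configuration. A frozen layer keeps its random initialization, is excluded from the optimizer, and does not need to be uploaded to the server after the initial round, which is where the communication savings come from.

import copy
import torch
import torch.nn as nn

def build_model():
    # Small MLP standing in for the on-device model (hypothetical architecture).
    return nn.Sequential(
        nn.Linear(32, 256),  # module "0": hidden layer, kept frozen at its random init
        nn.ReLU(),           # module "1"
        nn.Linear(256, 10),  # module "2": output layer, kept trainable
    )

def freeze_partially(model, trainable_prefixes=("2.",)):
    # Freeze every parameter whose name does not start with a trainable prefix.
    for name, p in model.named_parameters():
        p.requires_grad = any(name.startswith(pref) for pref in trainable_prefixes)

def client_update(global_model, data_loader, lr=0.1, epochs=1):
    # One round of local training; only trainable parameters are optimized,
    # and only they are returned for upload to the server.
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD([p for p in model.parameters() if p.requires_grad], lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in data_loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return {n: p.detach().clone() for n, p in model.named_parameters() if p.requires_grad}

if __name__ == "__main__":
    global_model = build_model()
    freeze_partially(global_model)
    fake_loader = [(torch.randn(64, 32), torch.randint(0, 10, (64,)))]
    update = client_update(global_model, fake_loader)
    total = sum(p.numel() for p in global_model.parameters())
    uploaded = sum(v.numel() for v in update.values())
    print(f"client uploads {uploaded} of {total} parameters per round")

In a full federated round, the server would average the uploaded trainable slices from many clients into the global model, while the frozen portion is shared once (or regenerated from a common random seed) and never retransmitted.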


