Towards Federated Learning Under Resource Constraints via Layer-wise Training and Depth Dropout

09/11/2023
by Pengfei Guo et al.

Large machine learning models trained on diverse data have recently seen unprecedented success. Federated learning enables training on private data that may otherwise be inaccessible, such as domain-specific datasets decentralized across many clients. However, federated learning can be difficult to scale to large models when clients have limited resources. This challenge often results in a trade-off between model size and access to diverse data. To mitigate this issue and facilitate training of large models on edge devices, we introduce a simple yet effective strategy, Federated Layer-wise Learning, to simultaneously reduce per-client memory, computation, and communication costs. Clients train just a single layer each round, reducing resource costs considerably with minimal performance degradation. We also introduce Federated Depth Dropout, a complementary technique that randomly drops frozen layers during training, to further reduce resource usage. Coupling these two techniques enables us to effectively train significantly larger models on edge devices. Specifically, we reduce training memory usage by 5x or more in federated self-supervised representation learning and demonstrate that performance in downstream tasks is comparable to conventional federated self-supervised learning.
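To make the mechanics concrete, the sketch below shows one federated round combining both ideas in PyTorch. It is a minimal illustration under assumptions, not the paper's implementation: the toy residual StackedModel, the supervised cross-entropy loss (the paper targets self-supervised representation learning), the FedAvg-style averaging, and the rule that the active block is never dropped are all illustrative choices.

```python
import copy
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class StackedModel(nn.Module):
    """Toy backbone exposed as an ordered stack of residual blocks (illustrative)."""
    def __init__(self, dim=128, depth=8, num_classes=10):
        super().__init__()
        self.blocks = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, dim), nn.ReLU()) for _ in range(depth))
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x, keep=None):
        for i, block in enumerate(self.blocks):
            if keep is None or keep[i]:  # Depth Dropout: skip dropped blocks
                x = x + block(x)         # residual form keeps dropping well-behaved
        return self.head(x)

def client_update(global_model, active, loader, drop_p=0.5, lr=0.01):
    """One client round: train only the active block, drop frozen blocks at random."""
    model = copy.deepcopy(global_model)
    # Federated Layer-wise Learning: only the active block receives gradients;
    # all other blocks are frozen (the head stays trainable in this sketch).
    for i, block in enumerate(model.blocks):
        block.requires_grad_(i == active)
    # Federated Depth Dropout (assumed scheme): each frozen block is dropped
    # for the whole round with probability drop_p; the active block is kept.
    keep = [i == active or random.random() > drop_p
            for i in range(len(model.blocks))]
    opt = torch.optim.SGD(
        [p for p in model.parameters() if p.requires_grad], lr=lr)
    for x, y in loader:
        opt.zero_grad()
        loss = F.cross_entropy(model(x, keep), y)
        loss.backward()
        opt.step()
    # Only the trained block (plus head) is sent back, shrinking communication.
    return {k: v for k, v in model.state_dict().items()
            if k.startswith(f"blocks.{active}.") or k.startswith("head.")}

def server_round(global_model, active, client_loaders):
    """One federated round: every client trains the same active block."""
    updates = [client_update(global_model, active, dl) for dl in client_loaders]
    averaged = {k: torch.stack([u[k] for u in updates]).mean(dim=0)
                for k in updates[0]}
    global_model.load_state_dict(averaged, strict=False)
```

Because each client backpropagates through and uploads only a single block per round, gradient memory, optimizer state, and upload size scale with one layer rather than the full network, and Depth Dropout further trims the forward cost of the frozen stack.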


Related research

10/07/2021
Enabling On-Device Training of Speech Recognition Models with Federated Dropout
Federated learning can be used to train machine learning models on the e...

06/02/2023
Resource-Efficient Federated Hyperdimensional Computing
In conventional federated hyperdimensional computing (HDC), training lar...

11/08/2020
Adaptive Federated Dropout: Improving Communication Efficiency and Generalization for Federated Learning
With more regulations tackling users' privacy-sensitive data protection ...

08/30/2022
Reducing Impacts of System Heterogeneity in Federated Learning using Weight Update Magnitudes
The widespread adoption of handheld devices has fueled rapid growth in ...

03/08/2023
Memory-adaptive Depth-wise Heterogenous Federated Learning
Federated learning is a promising paradigm that allows multiple clients ...

10/11/2021
ProgFed: Effective, Communication, and Computation Efficient Federated Learning by Progressive Training
Federated learning is a powerful distributed learning scheme that allows...

10/27/2022
Federated Graph Representation Learning using Self-Supervision
Federated graph representation learning (FedGRL) brings the benefits of ...
