FL-NTK: A Neural Tangent Kernel-based Framework for Federated Learning Convergence Analysis

05/11/2021
by   Baihe Huang, et al.

Federated Learning (FL) is an emerging learning scheme that allows distributed clients to train deep neural networks together without sharing data. Neural networks have become popular in this setting due to their unprecedented empirical success. To the best of our knowledge, the theoretical guarantees of FL for neural networks with explicit forms and multi-step local updates remain unexplored. Closing this gap is non-trivial for two reasons: first, the objective loss function being optimized is non-smooth and non-convex, and second, the model is not even updated in the gradient direction, since each communication round aggregates multiple local steps. Existing convergence results for gradient descent-based methods rely heavily on the fact that updates follow the gradient direction. This paper presents a new class of convergence analysis for FL, the Federated Learning Neural Tangent Kernel (FL-NTK), which corresponds to over-parameterized ReLU neural networks trained by gradient descent in FL and is inspired by the Neural Tangent Kernel (NTK) analysis. Theoretically, FL-NTK converges to a globally optimal solution at a linear rate with properly tuned learning parameters. Furthermore, under suitable distributional assumptions, FL-NTK also achieves good generalization.
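To make the analyzed setting concrete, below is a minimal sketch of FedAvg-style training of an over-parameterized two-layer ReLU network: each client runs a few local gradient-descent steps from the shared model, and the server averages the resulting weights. The width, learning rate, number of local steps, and synthetic data are illustrative assumptions, not the paper's exact protocol or parameters.

# Minimal sketch of the FL setting studied by FL-NTK: local gradient descent on a
# shared over-parameterized two-layer ReLU network, followed by server averaging.
# All hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
d, m, n_clients, n_per_client = 5, 256, 4, 20   # input dim, width, clients, samples per client

# Synthetic client data; NTK-style analyses typically assume unit-norm inputs.
data = []
for _ in range(n_clients):
    X = rng.normal(size=(n_per_client, d))
    X /= np.linalg.norm(X, axis=1, keepdims=True)
    y = np.sin(X @ rng.normal(size=d))           # arbitrary smooth target
    data.append((X, y))

# Two-layer ReLU network f(x) = (1/sqrt(m)) * sum_r a_r * relu(w_r . x),
# with only the first layer W trained (a standard NTK-regime setup).
W = rng.normal(size=(m, d))
a = rng.choice([-1.0, 1.0], size=m)

def predict(W, X):
    return (np.maximum(X @ W.T, 0.0) @ a) / np.sqrt(m)

def local_grad(W, X, y):
    """Gradient of 0.5 * ||f(X) - y||^2 with respect to W."""
    pre = X @ W.T                                # (n, m) pre-activations
    err = predict(W, X) - y                      # (n,) residuals
    mask = (pre > 0).astype(float)               # ReLU derivative
    # dL/dW_r = (1/sqrt(m)) * a_r * sum_i err_i * 1[w_r . x_i > 0] * x_i
    return ((mask * err[:, None]).T @ X) * a[:, None] / np.sqrt(m)

lr, local_steps, rounds = 0.1, 5, 50
for t in range(rounds):
    client_weights = []
    for X, y in data:                            # each client starts from the global model
        W_local = W.copy()
        for _ in range(local_steps):             # multi-step local updates
            W_local -= lr * local_grad(W_local, X, y)
        client_weights.append(W_local)
    W = np.mean(client_weights, axis=0)          # server aggregation by averaging

total_loss = sum(0.5 * np.sum((predict(W, X) - y) ** 2) for X, y in data)
print(f"training loss after {rounds} rounds: {total_loss:.4f}")

Roughly speaking, in the NTK regime the width m is large enough that the aggregated iterates stay close to their initialization, which is what allows the non-convex, non-smooth training loss to decrease at a linear rate despite the averaged, multi-step updates.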


Related research

07/09/2022 · Multi-Model Federated Learning with Provable Guarantees
Federated Learning (FL) is a variant of distributed learning where edge ...

09/18/2020 · Federated Learning with Nesterov Accelerated Gradient Momentum Method
Federated learning (FL) is a fast-developing technique that allows multi...

06/22/2022 · Federated Latent Class Regression for Hierarchical Data
Federated Learning (FL) allows a number of agents to participate in trai...

10/05/2022 · FedMT: Federated Learning with Mixed-type Labels
In federated learning (FL), classifiers (e.g., deep networks) are traine...

10/18/2019 · Federated Learning with Unbiased Gradient Aggregation and Controllable Meta Updating
Federated Averaging (FedAvg) serves as the fundamental framework in Fede...

03/28/2022 · Federated Learning with Position-Aware Neurons
Federated Learning (FL) fuses collaborative models from local nodes with...

05/10/2022 · A Communication-Efficient Distributed Gradient Clipping Algorithm for Training Deep Neural Networks
In distributed training of deep neural networks or Federated Learning (F...
