Decentralized Federated Learning via Non-Coherent Over-the-Air Consensus

10/27/2022
by Nicolo Michelusi, et al.

This paper presents NCOTA-DGD, a Decentralized Gradient Descent (DGD) algorithm that combines local gradient descent with Non-Coherent Over-The-Air (NCOTA) consensus at the receivers to solve distributed machine-learning problems over wirelessly-connected systems. NCOTA-DGD exploits the waveform superposition properties of wireless channels: it enables simultaneous transmissions under half-duplex constraints by mapping local signals to a mixture of preamble sequences, and achieves consensus via non-coherent combining at the receivers. NCOTA-DGD operates without channel state information and leverages the average channel pathloss to mix signals, without explicit knowledge of the mixing weights (typically assumed known in consensus-based optimization algorithms). It is shown both theoretically and numerically that, for smooth and strongly-convex problems with fixed consensus and learning stepsizes, the updates of NCOTA-DGD converge (in Euclidean distance) to the global optimum with rate 𝒪(K^{-1/4}) for a target number of iterations K. NCOTA-DGD is evaluated numerically on a logistic regression problem, showing faster convergence, as a function of running time, than implementations of the classical DGD algorithm over digital and analog orthogonal channels.
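For orientation, below is a minimal sketch of the classical DGD update that NCOTA-DGD builds on: each node mixes its neighbors' models via a mixing matrix and then takes a local gradient step. The toy quadratic objective and the ring-topology mixing matrix W are hypothetical illustrations; in NCOTA-DGD the mixing step is realized implicitly over the wireless channel via non-coherent combining of preamble mixtures, not with the explicit weights written here.

```python
import numpy as np

rng = np.random.default_rng(0)
N, d = 5, 3                       # number of nodes, model dimension

# Hypothetical local losses f_i(x) = 0.5 * ||x - b_i||^2 (strongly convex)
B = rng.normal(size=(N, d))

# Hypothetical doubly-stochastic mixing matrix for a ring topology
W = np.eye(N) * 0.5
for i in range(N):
    W[i, (i - 1) % N] = 0.25
    W[i, (i + 1) % N] = 0.25

eta = 0.05                        # fixed learning stepsize
X = np.zeros((N, d))              # local models x_i, one row per node

for k in range(500):
    grads = X - B                 # local gradients of f_i at x_i
    X = W @ X - eta * grads       # consensus mixing + local gradient step

# Global optimum of (1/N) * sum_i f_i is the mean of the b_i
x_star = B.mean(axis=0)
print("distance to optimum:", np.linalg.norm(X - x_star))
```

With a fixed stepsize, the iterates converge to a neighborhood of the global optimum, consistent with the fixed-stepsize regime analyzed in the paper; NCOTA-DGD replaces the explicit `W @ X` mixing with channel-induced averaging, requiring no channel state information.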
