Faster Asynchronous SGD

by Augustus Odena et al.

Asynchronous distributed stochastic gradient descent methods have trouble converging because of stale gradients: a gradient update sent to a parameter server by a client is stale if the parameters used to compute that gradient have since been updated on the server. Previously proposed approaches circumvent this problem by quantifying staleness in terms of the number of elapsed updates. In this work, we propose a novel method that instead quantifies staleness in terms of moving averages of gradient statistics. We show that this method outperforms previous methods in both convergence speed and scalability to many clients. We also discuss how an extension of this method can dramatically reduce bandwidth costs in a distributed training context; in particular, it allows total bandwidth usage to be reduced by a factor of 5 with little impact on the convergence of the training cost. Finally, we describe (and link to) a software library that we have used to simulate these algorithms deterministically on a single machine.
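To make the idea concrete, here is a minimal single-machine sketch of a staleness-aware parameter server in the spirit described above. It is not the paper's actual algorithm: the class name, the choice of gradient norm as the tracked statistic, and the down-weighting rule are all illustrative assumptions. A client's gradient that deviates strongly from an exponential moving average of recent gradient norms is treated as likely stale and is scaled down before being applied.

```python
import numpy as np


class StalenessAwareServer:
    """Toy parameter server that down-weights gradient updates whose
    norms deviate strongly from a moving average of recent gradient
    norms. Hypothetical sketch, not the paper's exact method."""

    def __init__(self, dim, lr=0.1, ema_decay=0.9):
        self.params = np.zeros(dim)
        self.lr = lr
        self.ema_decay = ema_decay
        self.grad_norm_ema = None  # moving average of gradient norms

    def apply_gradient(self, grad):
        norm = np.linalg.norm(grad)
        if self.grad_norm_ema is None:
            self.grad_norm_ema = norm
        # Down-weight gradients much larger than the recent average,
        # using the deviation as a crude proxy for staleness.
        scale = min(1.0, self.grad_norm_ema / (norm + 1e-8))
        self.params -= self.lr * scale * grad
        # Update the running statistic after applying the gradient.
        self.grad_norm_ema = (self.ema_decay * self.grad_norm_ema
                              + (1.0 - self.ema_decay) * norm)
        return scale


# Deterministic single-machine simulation of one lagging client:
# the client only refreshes its parameter copy every 5 steps, so
# most of its gradients are computed against stale parameters.
target = np.array([1.0, -2.0])
server = StalenessAwareServer(dim=2)
stale_params = server.params.copy()
for step in range(200):
    if step % 5 == 0:  # simulated (infrequent) parameter refresh
        stale_params = server.params.copy()
    grad = stale_params - target  # gradient of 0.5 * ||x - target||^2
    server.apply_gradient(grad)
```

Despite every gradient being up to four updates stale, the run converges to the target on this toy quadratic; the same loop also makes it easy to see how a deterministic single-machine simulation of asynchronous training, as mentioned at the end of the abstract, can be set up.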


Related Papers

Making Asynchronous Stochastic Gradient Descent Work for Transformers

Asynchronous stochastic gradient descent (SGD) is attractive from a spee...

Distributed SGD Generalizes Well Under Asynchrony

The performance of fully synchronized distributed systems has faced a bo...

OD-SGD: One-step Delay Stochastic Gradient Descent for Distributed Training

The training of modern deep neural networks calls for large amou...

Personalized Federated Learning with Exact Stochastic Gradient Descent

In Federated Learning (FL), datasets across clients tend to be heterogen...

Decentralized Federated Averaging

Federated averaging (FedAvg) is a communication efficient algorithm for ...

Taming Momentum in a Distributed Asynchronous Environment

Although distributed computing can significantly reduce the training tim...

Feature Whitening via Gradient Transformation for Improved Convergence

Feature whitening is a known technique for speeding up the training of DNNs. ...
