Asynchronous Decentralized Optimization in Directed Networks
A popular asynchronous protocol for decentralized optimization is randomized gossip, where a randomly selected pair of neighboring nodes updates via pairwise averaging. In practice, this can create deadlocks and is vulnerable to information delays. It is also problematic if a node is unable to respond or only has access to its own private local dataset. To address these issues simultaneously, this paper proposes an asynchronous decentralized algorithm, APPG, with directed communication, in which each node updates asynchronously and independently of all other nodes. If the local functions are strongly convex with Lipschitz-continuous gradients, every node of APPG converges to the same optimal solution at a rate of O(λ^k), where λ∈(0,1) and the virtual counter k increases by 1 whenever any node updates. The superior performance of APPG is validated on a logistic regression problem against state-of-the-art methods in terms of linear speedup and system implementation.
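To make the randomized-gossip baseline mentioned above concrete, here is a minimal sketch of pairwise averaging over an undirected graph: at each step a randomly chosen pair of neighbors replaces its local values with their average. The function name, graph, initial values, and step count are illustrative assumptions and not taken from the paper; this sketches the baseline protocol, not APPG itself.

```python
import random

def randomized_gossip(values, edges, num_steps, seed=0):
    """Pairwise-averaging gossip over an undirected graph (illustrative sketch).

    values : dict mapping node id -> local scalar (or vector) estimate
    edges  : list of (i, j) neighbor pairs
    """
    rng = random.Random(seed)
    x = dict(values)
    for _ in range(num_steps):
        i, j = rng.choice(edges)      # wake up one random neighbor pair
        avg = (x[i] + x[j]) / 2.0     # pairwise average
        x[i] = x[j] = avg             # both nodes adopt the average
    return x

# Example: four nodes on a ring drift toward the network-wide mean (3.0 here).
vals = {0: 4.0, 1: 0.0, 2: 2.0, 3: 6.0}
ring = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(randomized_gossip(vals, ring, num_steps=200))
```

Because each step requires both endpoints of an edge to participate at the same time, a slow or unresponsive node can stall its neighbors, which is the deadlock and delay issue the abstract points to; APPG avoids this by letting each node update independently over directed links.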