Asynchronous Subgradient-Push Algorithm for Distributed Optimization over Directed Graphs

2019 
This paper proposes an asynchronous subgradient-push algorithm (AsySPA) to solve an additive cost optimization problem over digraphs, where each node only has access to a local convex objective function. The striking feature of our asynchronous algorithm is that every node updates independently using only local information, without any coordination with other nodes. Thus, no global clock or synchronization scheme is needed, and nodes can update their iterates at different rates. To address asynchrony among nodes and communication delays, we design a novel mechanism that adaptively adjusts the stepsize at each update of each node, and construct a delay-free augmented system. Moreover, we propose a generalized subgradient algorithm, which provides a unified approach to studying both the standard subgradient algorithm and the celebrated incremental subgradient algorithm, and use it to prove the exact convergence of the AsySPA. Finally, we demonstrate the efficiency and advantages of the AsySPA over the current state-of-the-art by training a multi-class logistic regression classifier on a large real dataset.
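The abstract does not give the update rules of the AsySPA itself, but the synchronous subgradient-push scheme it builds on (Nedić and Olshevsky style) can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it assumes a small directed ring with self-loops, local objectives f_i(z) = (z - c_i)^2 chosen purely for demonstration, and a diminishing stepsize; the column-stochastic mixing matrix A, the auxiliary weights y, and all variable names are assumptions for this sketch.

```python
import numpy as np

def subgradient_push(c, T=5000, step=0.5):
    """Synchronous subgradient-push on a directed ring with self-loops.

    Each node i holds a local objective f_i(z) = (z - c_i)^2, so the
    network-wide minimizer of sum_i f_i is the mean of c.
    """
    n = len(c)
    # Column-stochastic mixing matrix: node j splits its mass equally
    # between itself and its out-neighbor (j + 1) % n.
    A = np.zeros((n, n))
    for j in range(n):
        A[j, j] = 0.5
        A[(j + 1) % n, j] = 0.5
    x = np.zeros(n)   # primal variables
    y = np.ones(n)    # push-sum weights (correct the bias of directed mixing)
    z = x / y         # de-biased estimates
    for t in range(T):
        w = A @ x                       # mix primal variables along edges
        y = A @ y                       # mix push-sum weights
        z = w / y                       # recover unbiased estimate
        grad = 2.0 * (z - c)            # local subgradient of f_i at z_i
        x = w - (step / (t + 1)) * grad # diminishing-stepsize descent
    return z

if __name__ == "__main__":
    c = np.array([1.0, 2.0, 6.0])
    z = subgradient_push(c)
    print(z)  # all entries approach mean(c) = 3
```

The paper's contribution lies in removing the synchrony assumed above: in the AsySPA, nodes run such updates at different rates and with delayed information, which is compensated by the adaptive stepsize mechanism described in the abstract.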