Variance-Reduced Decentralized Stochastic Optimization with Gradient Tracking -- Part II: GT-SVRG.

2019 
Decentralized stochastic optimization has recently benefited from gradient tracking methods~\cite{DSGT_Pu,DSGT_Xin}, which provide efficient solutions for large-scale empirical risk minimization problems. In Part I~\cite{GT_SAGA} of this work, we developed \textbf{\texttt{GT-SAGA}}, a decentralized implementation of SAGA~\cite{SAGA} based on gradient tracking, and identified regimes of practical interest where~\textbf{\texttt{GT-SAGA}} outperforms existing decentralized approaches in terms of the total number of local gradient computations. In this paper, we describe~\textbf{\texttt{GT-SVRG}}, a decentralized, gradient-tracking-based implementation of SVRG~\cite{SVRG}, another well-known variance-reduction technique. We show that the convergence rate of~\textbf{\texttt{GT-SVRG}} matches that of~\textbf{\texttt{GT-SAGA}} for smooth and strongly convex functions, and we highlight the trade-offs between the two algorithms in various settings.
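
To make the update structure concrete, the following is a minimal single-process simulation sketch of a GT-SVRG-style iteration on a synthetic decentralized least-squares problem. It is illustrative rather than the paper's implementation: the ring mixing matrix W, the step size alpha, the inner-loop length T, and the exact ordering of the consensus, descent, and tracking steps are assumptions following the generic gradient-tracking template, with the SVRG variance-reduced gradient computed against a per-node snapshot.

\begin{verbatim}
# Hedged sketch of a GT-SVRG-style update; constants are illustrative,
# not the paper's tuned choices.
import numpy as np

rng = np.random.default_rng(0)
n, m, d = 8, 20, 5                  # nodes, samples per node, dimension
A = rng.normal(size=(n, m, d))      # local data matrices
b = rng.normal(size=(n, m))         # local targets

def grad(i, s, x):
    """Gradient of the s-th local least-squares sample at node i."""
    a = A[i, s]
    return (a @ x - b[i, s]) * a

def full_grad(i, x):
    """Full local gradient (1/m) * sum_s grad_{i,s}(x)."""
    return (A[i].T @ (A[i] @ x - b[i])) / m

# Doubly stochastic mixing matrix for a ring graph: each node
# averages with its two neighbors.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0

alpha, T, outer = 0.05, m, 50       # step size, inner loop, outer loops
x = np.zeros((n, d))                # local iterates x_i
v = np.array([full_grad(i, x[i]) for i in range(n)])  # local VR grads
y = v.copy()                        # gradient trackers y_i

for _ in range(outer):
    tau = x.copy()                  # per-node snapshot points
    g_tau = np.array([full_grad(i, tau[i]) for i in range(n)])
    for _ in range(T):
        x = W @ x - alpha * y       # consensus step + descent along y_i
        v_new = np.empty_like(v)
        for i in range(n):
            s = rng.integers(m)     # sampled local component
            # SVRG variance-reduced gradient at node i.
            v_new[i] = grad(i, s, x[i]) - grad(i, s, tau[i]) + g_tau[i]
        y = W @ y + v_new - v       # gradient-tracking recursion
        v = v_new

print("consensus residual:", np.linalg.norm(x - x.mean(0)))
\end{verbatim}

The tracker recursion y = W y + v_new - v preserves the network average of the local gradient estimators, so each y_i asymptotically follows the global gradient direction, while the SVRG correction against the snapshot removes the sampling variance; the combination of these two mechanisms underlies the convergence behavior described above.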