An Estimate Sequence for Geodesically Convex Optimization

2018 
In this paper, we propose a Riemannian version of Nesterov's Accelerated Gradient algorithm (R-AGD), and show that for geodesically smooth and strongly convex problems, within a neighborhood whose radius depends on the condition number as well as the sectional curvature of the manifold, R-AGD converges to the minimum with acceleration. Unlike the algorithm in (Liu et al., 2017), which requires the exact solution of a nonlinear equation that may itself be intractable, our algorithm is constructive and efficiently implementable. Our proof relies on a new estimate sequence and a novel bound on the nonlinear metric distortion, both of which may be of independent interest.
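To make the setting concrete, below is a minimal sketch of a Nesterov-style accelerated gradient loop on the unit sphere, written with the sphere's exponential and logarithm maps. The coupling of the two sequences (x_k, v_k) and the constants alpha, L, mu are illustrative assumptions, not the paper's exact R-AGD update or its convergence guarantee.

```python
import numpy as np

def exp_map(x, v):
    """Exponential map on the sphere: move from x along tangent vector v."""
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return x
    return np.cos(norm_v) * x + np.sin(norm_v) * (v / norm_v)

def log_map(x, y):
    """Logarithm map on the sphere: tangent vector at x pointing toward y."""
    cos_theta = np.clip(np.dot(x, y), -1.0, 1.0)
    theta = np.arccos(cos_theta)
    if theta < 1e-12:
        return np.zeros_like(x)
    u = y - cos_theta * x
    return theta * u / np.linalg.norm(u)

def riemannian_grad(x, euclid_grad):
    """Project the Euclidean gradient onto the tangent space at x."""
    return euclid_grad - np.dot(x, euclid_grad) * x

def accelerated_descent(f_grad, x0, L, mu, iters=100):
    """Accelerated-gradient-style loop with an auxiliary sequence v_k (sketch)."""
    alpha = np.sqrt(mu / L)  # momentum parameter, an illustrative choice
    x, v = x0.copy(), x0.copy()
    for _ in range(iters):
        # y_k: point on the geodesic from x_k toward v_k.
        y = exp_map(x, alpha * log_map(x, v))
        g = riemannian_grad(y, f_grad(y))
        # Gradient step from y_k with step size 1/L.
        x = exp_map(y, -(1.0 / L) * g)
        # Auxiliary update: contract toward y_k, then step along -g (illustrative).
        v = exp_map(y, (1 - alpha) * log_map(y, v) - (alpha / mu) * g)
    return x

# Usage: minimize f(x) = -<x, p>, geodesically convex near p, on the sphere.
p = np.array([0.0, 0.0, 1.0])
x0 = np.array([1.0, 0.0, 0.0])
x_star = accelerated_descent(lambda x: -p, x0, L=1.0, mu=0.5, iters=50)
```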