Proximal gradient method for nonconvex and nonsmooth optimization on Hadamard manifolds

2021 
In this paper, we address the problem of minimizing nonconvex and nonsmooth functions on Hadamard manifolds and develop an improved proximal gradient method. First, by exploiting the geometric structure of manifolds with non-positive curvature, we propose a monotone proximal gradient algorithm with a fixed step size on Hadamard manifolds. Then, a convergence theorem for the proposed method is established under a suitable definition of the proximal gradient mapping on manifolds. If the function further satisfies the Riemannian Kurdyka-Łojasiewicz (KL) property with an exponent, a local convergence rate is derived. Finally, numerical experiments on a particular Hadamard manifold, the manifold of symmetric positive definite matrices, demonstrate the advantages of the proposed method.
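To illustrate the geometric ingredients the abstract refers to, the sketch below implements a fixed-step Riemannian gradient update on the SPD manifold under the affine-invariant metric, applied to a toy smooth objective. This covers only the gradient (smooth) part of a proximal gradient scheme; the proximal mapping for the nonsmooth term, as defined in the paper, is omitted. All function names and the choice of objective are illustrative, not the authors' code.

```python
import numpy as np

def sym_funcm(S, f):
    """Apply a scalar function f to a symmetric matrix via eigendecomposition."""
    w, Q = np.linalg.eigh(S)
    return Q @ np.diag(f(w)) @ Q.T

def spd_exp(X, V):
    """Exponential map on the SPD manifold (affine-invariant metric):
    Exp_X(V) = X^{1/2} expm(X^{-1/2} V X^{-1/2}) X^{1/2}."""
    Xh  = sym_funcm(X, np.sqrt)
    Xih = sym_funcm(X, lambda w: 1.0 / np.sqrt(w))
    return Xh @ sym_funcm(Xih @ V @ Xih, np.exp) @ Xh

def riemannian_gradient_step(X, egrad, t):
    """Under the affine-invariant metric, the Riemannian gradient is
    X @ egrad @ X (for symmetric egrad); move along the geodesic with
    fixed step size t via the exponential map."""
    rgrad = X @ egrad @ X
    return spd_exp(X, -t * rgrad)

# Toy smooth objective: f(X) = 0.5 * ||X - A||_F^2, Euclidean gradient X - A.
A = np.diag([2.0, 3.0])
X = np.eye(2)
for _ in range(300):
    X = riemannian_gradient_step(X, X - A, t=0.05)
# X converges toward the SPD minimizer A.
```

In the full method of the paper, the plain geodesic update above would be replaced by the proximal gradient mapping, which additionally accounts for the nonsmooth term at each iteration.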