Superfast second-order methods for unconstrained convex optimization

2020 
In this paper, we present new second-order methods with convergence rate O(k^{-4}), where k is the iteration counter. This is faster than the existing lower bound for this type of scheme [1,2], which is O(k^{-7/2}). Our progress can be explained by a finer specification of the problem class. The main idea of this approach is to implement the third-order scheme from [15] using a second-order oracle. At each iteration of our method, we solve a nontrivial auxiliary problem by a linearly convergent scheme based on the relative non-degeneracy condition [3, 10]. During this process, the Hessian of the objective function is computed once, and the gradient is computed O(ln 1/epsilon) times, where epsilon is the desired accuracy of the solution for our problem.
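To illustrate the oracle-complexity pattern described above (one Hessian evaluation per outer iteration, with the gradient queried repeatedly by a linearly convergent inner solver), the following is a minimal sketch, not the paper's actual scheme: a frozen-Hessian iteration for the auxiliary problem, where the Hessian is factored once and each inner step costs one gradient call plus triangular solves. The function names and the test objective are illustrative assumptions.

```python
import numpy as np

def inner_solver(grad, hess_x, x0, tol=1e-10, max_iter=200):
    """Toy inner loop (not the paper's method): a frozen-Hessian
    iteration.  hess_x is the Hessian computed once at the current
    outer iterate; it is factored a single time, and every inner step
    then needs only one gradient evaluation plus cheap solves, so a
    linearly convergent loop uses O(ln 1/eps) gradient calls."""
    L = np.linalg.cholesky(hess_x)      # one factorization, reused below
    h = x0.copy()
    for it in range(max_iter):
        g = grad(h)                     # one gradient evaluation per step
        if np.linalg.norm(g) < tol:
            break
        # Newton-like step with the frozen Hessian: h <- h - H^{-1} g
        h = h - np.linalg.solve(L.T, np.linalg.solve(L, g))
    return h, it

# Illustrative convex objective: f(x) = 1/2 x^T A x - b^T x + 0.05 sum x_i^4
A = np.diag([1.0, 2.0, 3.0])
b = np.ones(3)
grad = lambda x: A @ x - b + 0.2 * x**3
h, iters = inner_solver(grad, A, np.zeros(3))
print(np.linalg.norm(grad(h)), iters)
```

Because the quartic term only perturbs the quadratic part, the frozen-Hessian map is a contraction here and the iteration converges linearly, which is the qualitative behavior the abstract attributes to the relative non-degeneracy condition.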