Global performance guarantees of second-order methods for unconstrained convex minimization

2018 
In this paper we compare two distinct branches of research on second-order optimization methods. The first studies self-concordant functions and barriers, the main assumption being that the third derivative of the objective is bounded by the second derivative. The second branch studies cubic regularized Newton methods, with the main assumption that the second derivative is Lipschitz continuous. We develop a new theoretical analysis of a path-following scheme for general self-concordant functions, as opposed to the classical path-following scheme developed for self-concordant barriers. We show that the complexity bound for this scheme is better than that of the Damped Newton Method. Next, we analyze an important subclass of general self-concordant functions, namely the class of strongly convex functions with Lipschitz continuous second derivative, and show that for this subclass cubic regularized Newton methods give an even better complexity bound.
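The Damped Newton Method mentioned in the abstract can be illustrated with a minimal sketch. The code below is not from the paper; it applies the classical damped step size 1/(1 + λ(x)), where λ(x) is the Newton decrement, to a simple one-dimensional self-concordant function. The function f(x) = x − ln(x) and the helper name `damped_newton` are illustrative choices, not part of the paper.

```python
import math

def damped_newton(f_grad, f_hess, x0, tol=1e-8, max_iter=100):
    """Damped Newton method for a scalar self-concordant function.

    Uses the Newton decrement lambda(x) = |f'(x)| / sqrt(f''(x)) and the
    classical step size 1 / (1 + lambda(x)), which yields global
    convergence for self-concordant objectives.
    """
    x = x0
    for _ in range(max_iter):
        g, h = f_grad(x), f_hess(x)
        lam = abs(g) / math.sqrt(h)   # Newton decrement
        if lam < tol:
            break
        x -= g / h / (1.0 + lam)      # damped Newton step
    return x

# Example: f(x) = x - ln(x) on x > 0 is self-concordant; minimizer x* = 1.
x_star = damped_newton(lambda x: 1.0 - 1.0 / x,  # f'(x)
                       lambda x: 1.0 / x**2,      # f''(x)
                       x0=0.1)
```

Far from the minimizer the damping factor keeps steps short (guaranteeing global progress), while near the minimizer λ(x) → 0 and the scheme recovers the full Newton step with local quadratic convergence.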