Fast stochastic second-order method logarithmic in condition number

2019 
Abstract Optimization is central to machine learning because many machine learning models are trained by solving optimization problems. A wide range of learning algorithms, including deep learning, logistic regression, and support vector machines, are trained by minimizing an empirical loss. As data grows explosively, solving these optimization problems at scale becomes challenging. Recently, stochastic second-order methods have attracted much attention due to their low per-iteration cost, and they perform well when training models such as logistic regression and support vector machines. However, the computational complexity of existing stochastic second-order methods depends heavily on the condition number of the Hessian. In this paper, we propose a new Newton-like method called Preconditioned Newton Conjugate Gradient with Sketched Hessian (PNCG). The runtime complexity of PNCG is at most logarithmic in the condition number of the Hessian, so PNCG has advantages over existing subsampled Newton methods, especially when the Hessian in question is ill-conditioned. We also show empirically that our method performs well when training machine learning models, with consistent improvements in computational efficiency.
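The abstract does not spell out the algorithm, but the ingredients it names (a sketched Hessian used to precondition conjugate gradient on the Newton system) suggest the following minimal sketch. Everything here is an illustrative assumption rather than the paper's exact construction: the Gaussian sketch, the ridge-regularized least-squares Hessian H = AᵀA + λI, and all function names are hypothetical.

```python
import numpy as np

def pncg_step(A, g, sketch_size=64, reg=1e-8, tol=1e-6, max_iter=100, seed=None):
    """One Newton step for a loss whose Hessian is H = A.T @ A + reg * I.

    A Gaussian sketch S compresses A to S @ A; the sketched Hessian
    (SA).T @ (SA) + reg * I is factored once and applied as a
    preconditioner inside conjugate gradient on the exact system H x = -g.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape

    # Sketch the Hessian square root (other sketches, e.g. subsampling, also fit).
    S = rng.standard_normal((sketch_size, n)) / np.sqrt(sketch_size)
    SA = S @ A
    M = SA.T @ SA + reg * np.eye(d)      # sketched Hessian, the preconditioner
    L = np.linalg.cholesky(M)            # factor once, reuse every CG iteration

    def H_mv(v):                         # exact Hessian-vector product, no d x d H formed
        return A.T @ (A @ v) + reg * v

    def M_inv(r):                        # apply M^{-1} r via two triangular solves
        return np.linalg.solve(L.T, np.linalg.solve(L, r))

    # Preconditioned conjugate gradient on H x = -g.
    x = np.zeros(d)
    r = -g - H_mv(x)
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Hp = H_mv(p)
        alpha = rz / (p @ Hp)
        x += alpha * p
        r -= alpha * Hp
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Example: an ill-conditioned ridge-regularized quadratic (hypothetical data).
rng = np.random.default_rng(0)
A = rng.standard_normal((5000, 100)) * np.logspace(0, 4, 100)
g = A.T @ (A @ np.ones(100))             # gradient of the quadratic at x = 0
step = pncg_step(A, g, sketch_size=200, reg=1e-4)
```

Under this kind of construction, the preconditioned system is well-conditioned whenever the sketch approximates the Hessian well, which is why the CG iteration count, and hence the runtime, can depend only logarithmically on the condition number rather than on its square root.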