A modified scaled memoryless symmetric rank–one method

2020 
To guarantee heredity of positive definiteness under the popular Wolfe line search conditions, a modification is made to the symmetric rank–one updating formula, a simple quasi–Newton approximation of the (inverse) Hessian of the objective function of an unconstrained optimization problem. The scaling approach is then applied to a memoryless version of the proposed formula, yielding an iterative method suitable for solving large–scale problems. Based on an eigenvalue analysis, it is shown that the self–scaling parameter proposed by Oren and Spedicato is optimal for the proposed updating formula in the sense of minimizing the condition number. A sufficient descent property is also established for the method, together with a global convergence analysis for uniformly convex objective functions. Numerical experiments demonstrate the computational efficiency of the proposed method with the Oren–Spedicato self–scaling parameter.
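The kind of iteration described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact formula: it assumes the modification replaces the classical memoryless SR1 denominator (s − θy)ᵀy (which vanishes identically for the Oren–Spedicato parameter θ = sᵀy / yᵀy) by sᵀy, which is positive under the Wolfe conditions and therefore keeps the memoryless matrix positive definite. The function name and the safeguard threshold are hypothetical.

```python
import numpy as np

def sr1_memoryless_direction(g, s, y, eps=1e-10):
    """Search direction d = -H g for a scaled memoryless SR1-type update.

    H = theta*I + u u^T / (s^T y),  with  u = s - theta*y  and
    theta = s^T y / y^T y  (the Oren--Spedicato self-scaling parameter).

    NOTE: using s^T y as the denominator is an illustrative stand-in for
    the paper's modification; the published formula may differ in detail.
    """
    sy = float(s @ y)
    yy = float(y @ y)
    if sy <= eps or yy <= eps:
        # Curvature safeguard: fall back to steepest descent.
        return -g
    theta = sy / yy              # positive whenever s^T y > 0
    u = s - theta * y
    # d = -(theta*I + u u^T / s^T y) g, computed matrix-free:
    return -(theta * g + (u @ g) / sy * u)
```

Under the Wolfe conditions sᵀy > 0, so θ > 0 and the rank–one term is positive semidefinite; the matrix H is then positive definite, which makes d a descent direction (dᵀg < 0), consistent with the sufficient descent property claimed in the abstract.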