Improved Fletcher–Reeves and Dai–Yuan conjugate gradient methods with the strong Wolfe line search

2019 
Abstract Conjugate gradient methods (CGMs) are highly effective iterative methods for solving large-scale unconstrained optimization problems. The aim of this work is to improve the Fletcher–Reeves (FR) and Dai–Yuan (DY) CGMs. First, based on the conjugate parameters of the FR and DY methods, and incorporating the second inequality of the strong Wolfe line search, two new conjugate parameters are constructed. Second, using these two new conjugate parameters, another FR-type conjugate parameter is presented. Third, with the steplength generated by the strong Wolfe line search, three improved CGMs are proposed for large-scale unconstrained optimization. Under standard assumptions, all three improved methods are proved to possess the sufficient descent property and global convergence. Finally, three groups of numerical experiments and their corresponding performance profiles are reported, showing that the proposed methods are very promising.
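As background for the methods named in the abstract (this is the classical setting, not the paper's improved parameters): a CGM iterates x_{k+1} = x_k + α_k d_k with d_{k+1} = −g_{k+1} + β_k d_k, where the FR parameter is β^FR_k = ‖g_{k+1}‖² / ‖g_k‖² and the DY parameter is β^DY_k = ‖g_{k+1}‖² / (d_kᵀ(g_{k+1} − g_k)); the strong Wolfe line search requires f(x_k + α d_k) ≤ f(x_k) + δ α g_kᵀd_k and |g(x_k + α d_k)ᵀd_k| ≤ σ |g_kᵀd_k| with 0 < δ < σ < 1. The sketch below runs the classical FR update on a 2-D convex quadratic; for a quadratic, the exact steplength satisfies the strong Wolfe conditions (with δ ≤ 1/2), so it stands in for the line search here. The function name and test problem are illustrative, not from the paper.

```python
def fr_cg_quadratic(A, b, x0, tol=1e-10, max_iter=100):
    """Fletcher-Reeves CG for f(x) = 0.5 x^T A x - b^T x (A symmetric
    positive definite). Exact steplength plays the role of the strong
    Wolfe line search, which it satisfies on quadratics."""
    n = len(b)
    mat_vec = lambda M, v: [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))

    x = list(x0)
    g = [gi - bi for gi, bi in zip(mat_vec(A, x), b)]  # gradient: A x - b
    d = [-gi for gi in g]                              # initial steepest descent
    for _ in range(max_iter):
        gg = dot(g, g)
        if gg < tol:
            break
        Ad = mat_vec(A, d)
        alpha = -dot(g, d) / dot(d, Ad)                # exact steplength
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = [gi + alpha * adi for gi, adi in zip(g, Ad)]
        beta_fr = dot(g_new, g_new) / gg               # FR parameter ||g+||^2/||g||^2
        d = [-gi + beta_fr * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

# Example: minimize with A = [[3,1],[1,2]], b = [1,2]; minimizer is (0, 1).
print(fr_cg_quadratic([[3, 1], [1, 2]], [1, 2], [0, 0]))
```

The DY variant would replace the `beta_fr` line with `dot(g_new, g_new) / dot(d, [gn - gi for gn, gi in zip(g_new, g)])`; on general nonquadratic objectives, a genuine strong Wolfe line search (e.g. bracketing plus zoom) replaces the exact step.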