Preconditioning the Limited-Memory BFGS Algorithm

2006 
For large-scale unconstrained optimization, truncated Newton methods and limited-memory quasi-Newton methods are among the most effective choices. Truncated Newton methods often adopt a preconditioner to speed up convergence, but very little previous work has explored using a preconditioner in a limited-memory quasi-Newton method. In this thesis, we focus on the L-BFGS method, a limited-memory quasi-Newton method, and we implement and study a preconditioned L-BFGS algorithm (PLBFGS). We ran experiments on protein structure prediction problems with three kinds of preconditioners constructed from the protein potential energy function, and on a set of general test problems using a general-purpose preconditioner, the incomplete Cholesky factorization. The results show that PLBFGS achieves significant performance improvements over L-BFGS by using these preconditioners, and that it has advantages over the standard preconditioned truncated Newton method. We also propose a dynamic preconditioner update interval strategy that benefits both the preconditioned L-BFGS method and the preconditioned truncated Newton method.
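The abstract does not give the algorithmic details, but the core idea of preconditioning L-BFGS can be sketched with the standard two-loop recursion, replacing the usual scaled-identity initial matrix H0 with an application of a preconditioner inverse. The sketch below is a generic illustration under that assumption, not the thesis's exact PLBFGS implementation; `precond_solve` is a hypothetical callback standing in for M^{-1} v, and the driver uses a simple diagonal (Jacobi) preconditioner and the closed-form exact line search available for quadratics.

```python
import numpy as np

def plbfgs_direction(g, s_list, y_list, precond_solve):
    """L-BFGS two-loop recursion, with the usual initial matrix
    H0 = gamma * I replaced by a preconditioner application
    precond_solve(v) ~ M^{-1} v. Generic sketch, not the thesis's code."""
    q = g.copy()
    rhos = [1.0 / float(y @ s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest curvature pair to oldest
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        a = rho * float(s @ q)
        alphas.append(a)
        q = q - a * y
    r = precond_solve(q)  # preconditioner takes the place of H0
    # Second loop: oldest pair to newest
    for (s, y, rho), a in zip(zip(s_list, y_list, rhos), reversed(alphas)):
        beta = rho * float(y @ r)
        r = r + (a - beta) * s
    return -r  # descent direction

def plbfgs_quadratic(A, b, x0, m=5, iters=100, tol=1e-10):
    """Minimize 0.5 x'Ax - b'x (A symmetric positive definite) with the
    preconditioned recursion above, a diagonal (Jacobi) preconditioner as
    a simple stand-in, and an exact line search (closed form for quadratics)."""
    diag = np.diag(A)
    precond = lambda v: v / diag
    x, g = x0.copy(), A @ x0 - b
    s_list, y_list = [], []
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        d = plbfgs_direction(g, s_list, y_list, precond)
        t = -float(g @ d) / float(d @ (A @ d))  # exact step for a quadratic
        x_new = x + t * d
        g_new = A @ x_new - b
        s_list.append(x_new - x)
        y_list.append(g_new - g)
        if len(s_list) > m:  # keep only the m most recent pairs
            s_list.pop(0)
            y_list.pop(0)
        x, g = x_new, g_new
    return x
```

On a general objective, the exact quadratic step would be replaced by a Wolfe-condition line search, and `precond_solve` would be, for example, two triangular solves with incomplete Cholesky factors, as in the general preconditioner the abstract mentions.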