"An Enriched Algorithm for Large-Scale Nonlinear Optimization"
J.L. Morales, J. Nocedal.
Computational Optimization and Applications (COAP), Vol. 21, No. 2, pp. 143-154, February 2002.
This paper describes a class of optimization methods that interlace iterations of the limited memory BFGS method (L-BFGS) and a Hessian-free Newton method (HFN) in such a way that the information collected by one type of iteration improves the performance of the other. Curvature information about the objective function is stored in the form of a limited memory matrix, which plays the dual role of preconditioning the inner conjugate gradient iteration in the HFN method and of providing a warm start for the L-BFGS iterations. The lengths of the L-BFGS and HFN cycles are adjusted dynamically during the course of the optimization. Numerical experiments indicate that the new algorithm is very effective and is not sensitive to the choice of parameters.
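The interlacing scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes fixed cycle lengths (the paper adjusts them dynamically), a simple backtracking line search, and a finite-difference approximation standing in for the Hessian-free Hessian-vector products. The key idea it does reproduce is that both phases share one store of limited-memory correction pairs, which preconditions the inner CG iteration of the HFN phase and warm-starts the L-BFGS phase.

```python
import numpy as np

def two_loop(g, pairs):
    """Apply the limited-memory inverse Hessian to g (two-loop recursion)."""
    q, alphas = g.copy(), []
    for s, y in reversed(pairs):
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if pairs:
        s, y = pairs[-1]
        q *= (s @ y) / (y @ y)               # standard initial scaling
    for (s, y), a in zip(pairs, reversed(alphas)):
        q += (a - (y @ q) / (y @ s)) * s
    return q

def backtrack(f, x, p, g):
    """Simple Armijo backtracking line search."""
    t, fx = 1.0, f(x)
    while f(x + t * p) > fx + 1e-4 * t * (g @ p) and t > 1e-12:
        t *= 0.5
    return t

def pcg(grad, x, g, pairs, max_it=50, tol=1e-8):
    """Inner CG for the Newton system B d = -g, preconditioned by the
    limited-memory matrix (applied through the two-loop recursion)."""
    hv = lambda v: (grad(x + 1e-6 * v) - g) / 1e-6   # finite-diff B*v
    d, r = np.zeros_like(x), g.copy()
    z = two_loop(r, pairs) if pairs else r.copy()
    p, rz = -z, r @ z
    for _ in range(max_it):
        Bp = hv(p)
        if p @ Bp <= 0:                      # negative curvature: stop
            break
        a = rz / (p @ Bp)
        d, r = d + a * p, r + a * Bp
        if np.linalg.norm(r) < tol:
            break
        z = two_loop(r, pairs) if pairs else r.copy()
        p, rz = -z + ((r @ z) / rz) * p, r @ z
    return d if d.any() else -g

def enriched_minimize(f, grad, x0, m=5, l_len=5, n_len=2,
                      tol=1e-6, max_cycles=100):
    """Alternate cycles of L-BFGS steps and preconditioned HFN steps;
    both phases update the same store of (s, y) correction pairs."""
    x, pairs = x0.astype(float), []
    for _ in range(max_cycles):
        for k in range(l_len + n_len):
            g = grad(x)
            if np.linalg.norm(g) < tol:
                return x
            if k < l_len:                    # L-BFGS phase (warm-started)
                p = -two_loop(g, pairs) if pairs else -g
            else:                            # HFN phase (preconditioned CG)
                p = pcg(grad, x, g, pairs)
            x_new = x + backtrack(f, x, p, g) * p
            s, y = x_new - x, grad(x_new) - g
            if s @ y > 1e-10:                # keep the update positive definite
                pairs.append((s, y))
                pairs = pairs[-m:]           # retain the m most recent pairs
            x = x_new
    return x
```

For example, on the two-dimensional Rosenbrock function started at (-1.2, 1.0), the sketch converges to the minimizer (1, 1); the names `enriched_minimize`, `l_len`, and `n_len` are illustrative choices, not the paper's notation.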