A modified limited-memory BNS method for unconstrained minimization based on the conjugate directions idea
Abstract: A modification of the limited-memory variable metric BNS method for large-scale unconstrained optimization is proposed, which consists in corrections (derived from the idea of conjugate directions) of the used difference vectors for better satisfaction of previous quasi-Newton conditions. In comparison with earlier work, where a similar approach is used, correction vectors from more previous iterations can be applied here. For quadratic objective functions, the improvement of convergence is the best one in some sense: all stored corrected difference vectors are mutually conjugate, and the quasi-Newton conditions with these vectors are satisfied. Global convergence of the algorithm is established for convex, sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new method.
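The correction idea described above can be illustrated on a quadratic objective, where the gradient-difference vector satisfies y = A s for the Hessian A, so conjugacy with respect to A can be enforced using only stored pairs (s, y). The sketch below is an assumption-laden illustration of this conjugate-direction correction (a modified Gram-Schmidt conjugation of each new difference vector against the stored, already-corrected ones), not the exact BNS-based algorithm of the report; the function name and details are hypothetical.

```python
import numpy as np

def conjugate_correct(s_list, y_list, s_new, y_new, eps=1e-12):
    """Correct a new difference pair (s_new, y_new) so that s_new is
    conjugate w.r.t. the Hessian A to all stored corrected vectors.

    On a quadratic, y = A s, so the A-inner product s_j^T A s equals
    y_j^T s; applying the same linear combination to the y vectors
    preserves the relation y = A s for the corrected pair.
    Illustrative sketch only, not the report's exact method.
    """
    s_c, y_c = s_new.copy(), y_new.copy()
    for s_j, y_j in zip(s_list, y_list):
        denom = y_j @ s_j              # = s_j^T A s_j on a quadratic
        if abs(denom) > eps:
            coef = (y_j @ s_c) / denom # modified Gram-Schmidt coefficient
            s_c -= coef * s_j
            y_c -= coef * y_j          # keeps y_c = A s_c on a quadratic
    return s_c, y_c
```

With such corrections, the stored difference vectors stay mutually conjugate on a quadratic, and the quasi-Newton conditions hold for all stored (corrected) pairs, which is the property the abstract highlights.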
Keywords: Unconstrained minimization, variable metric methods, limited-memory methods, the BFGS update, conjugate directions, numerical results.
Category 1: Nonlinear Optimization
Category 2: Nonlinear Optimization (Unconstrained Optimization)
Citation: Technical report No. V 1203. Institute of Computer Science, Academy of Sciences of the Czech Republic. Prague, March 2014.
Entry Submitted: 03/12/2014