Two limited-memory optimization methods with minimum violation of the previous quasi-Newton equations

Limited-memory variable metric methods based on the well-known BFGS update are widely used for large-scale optimization. The block version of the BFGS update, derived by Schnabel (1983), Hu and Storey (1991) and Vlček and Lukšan (2019), satisfies the quasi-Newton equations with all used difference vectors and, for quadratic objective functions, gives the best improvement of convergence in a certain sense, but the corresponding direction vectors are generally not descent directions. To guarantee the descent property of the direction vectors while violating the previous quasi-Newton equations as little as possible in a well-defined sense, two methods based on the block BFGS update are proposed. They can be advantageously combined with methods based on vector corrections for conjugacy (Vlček and Lukšan, 2015). Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new methods.
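For orientation, the following is a minimal NumPy sketch of the classical limited-memory BFGS machinery the report builds on: the standard two-loop recursion, not the block update proposed in the report. The function name `lbfgs_direction` and the quadratic test case are illustrative assumptions. The check demonstrates the motivating gap the abstract describes: the conventional update reproduces only the most recent quasi-Newton (secant) equation H y_i = s_i exactly, whereas the block BFGS update satisfies all of them.

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion (illustrative sketch).

    Returns d = -H g, where H implicitly applies the limited-memory
    BFGS inverse Hessian built from stored pairs s_i = x_{i+1} - x_i,
    y_i = g_{i+1} - g_i (oldest first in the lists)."""
    q = np.array(g, dtype=float)
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q = q - alpha * y
    # Initial scaling H0 = gamma * I (a common heuristic choice).
    gamma = np.dot(s_list[-1], y_list[-1]) / np.dot(y_list[-1], y_list[-1])
    r = gamma * q
    # Second loop: oldest pair to newest.
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * np.dot(y, r)
        r = r + (alpha - beta) * s
    return -r

# Hypothetical quadratic test: for f(x) = 0.5 x^T A x we have y_i = A s_i,
# so every pair satisfies an exact secant relation. The recursion above
# reproduces H y_i = s_i only for the newest pair in general.
rng = np.random.default_rng(0)
n, m = 8, 3
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)                 # SPD Hessian
s_list = [rng.standard_normal(n) for _ in range(m)]
y_list = [A @ s for s in s_list]
for i, (s, y) in enumerate(zip(s_list, y_list)):
    Hy = -lbfgs_direction(y, s_list, y_list)   # H y_i
    print(f"pair {i}: ||H y_i - s_i|| = {np.linalg.norm(Hy - s):.2e}")
```

On this test the residual reaches machine precision only for the newest pair; satisfying all m quasi-Newton equations simultaneously is what the block BFGS update provides, at the cost of the descent property that the report's two methods are designed to restore.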

Citation

Technical report No. V 1280, Institute of Computer Science of the Czech Academy of Sciences.
