Optimization Online


A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions

Jan Vlcek(vlcel***at***cs.cas.cz)
Ladislav Luksan(luksan***at***cs.cas.cz)

Abstract: To improve the performance of the limited-memory variable metric L-BFGS method for large-scale unconstrained optimization, repeating some BFGS updates was proposed in [1, 2]. But the suitable extra updates need to be selected carefully, since the repeating process can be time consuming. We show that for the limited-memory variable metric BNS method, matrix updating can be efficiently repeated infinitely many times under some conditions, with only a small increase in the number of arithmetic operations. The limit variable metric matrix can be written as a block BFGS update [22], which can be obtained by solving a low-order Lyapunov matrix equation. The resulting method can be advantageously combined with methods based on vector corrections for conjugacy, see e.g. [21]. Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new method.
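The building block behind the repeated updating described above is the standard BFGS update of the inverse Hessian approximation. The sketch below is not the authors' repeated BNS scheme or the block/Lyapunov construction from the paper; it is just the elementary single update H+ = (I - rho*s*y^T) H (I - rho*y*s^T) + rho*s*s^T that such limited-memory methods repeat, shown to illustrate the secant property the update enforces.

```python
import numpy as np

def bfgs_inverse_update(H, s, y):
    """One standard BFGS update of the inverse Hessian approximation H.

    s = x_{k+1} - x_k (step), y = g_{k+1} - g_k (gradient change);
    requires the curvature condition s^T y > 0.
    """
    rho = 1.0 / (s @ y)
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # H+ = V H V^T + rho s s^T; the updated matrix satisfies H+ y = s.
    return V @ H @ V.T + rho * np.outer(s, s)

# Illustrative data (hypothetical, not from the paper's experiments):
s = np.array([1.0, 0.5, -0.2])
y = np.array([0.8, 0.4, 0.1])
H_new = bfgs_inverse_update(np.eye(3), s, y)
```

After the update, the secant condition H_new @ y == s holds exactly, which is what repeated application of such updates preserves and exploits.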

Keywords: Unconstrained minimization, variable metric methods, limited-memory methods, the repeated BFGS update, global convergence, numerical results

Category 1: Nonlinear Optimization

Category 2: Nonlinear Optimization (Unconstrained Optimization )

Citation: Technical report No. V 1245, Institute of Computer Science, Czech Academy of Sciences, Prague, March 2018

Download: [PDF]

Entry Submitted: 05/21/2018
Entry Accepted: 05/21/2018
Entry Last Modified: 05/21/2018
