Optimization Online
On the extension of the Hager-Zhang conjugate gradient method for vector optimization

M. L. N. Gonçalves (maxlng***at***ufg.br)
L. F. Prudente (lfprudente***at***ufg.br)

Abstract: This paper discusses the extension of the Hager-Zhang (HZ) nonlinear conjugate gradient method to vector optimization. In the scalar minimization case, this method generates descent directions whenever, for example, the line search satisfies the standard Wolfe conditions. We first show that, in general, the direct extension of the HZ method to vector optimization does not yield descent directions (in the vector sense), even when an exact line search is employed. By using a sufficiently accurate line search, we then propose a self-adjusting HZ method that possesses the descent property. With suitable parameter choices, the proposed HZ method reduces to the classical one in the scalar minimization case. Global convergence of the new scheme is proved without regular restarts or convexity assumptions. Finally, numerical experiments illustrating the practical behavior of the approach are presented, and comparisons with the Hestenes-Stiefel conjugate gradient method are discussed.
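For context only (standard notation, not the vector extension proposed in the paper), the classical scalar Hager-Zhang method generates search directions by

$$ d_0 = -\nabla f(x_0), \qquad d_{k+1} = -\nabla f(x_{k+1}) + \beta_k^{HZ} d_k, $$
$$ \beta_k^{HZ} = \frac{1}{d_k^{\top} y_k}\left( y_k - 2 d_k \frac{\|y_k\|^2}{d_k^{\top} y_k} \right)^{\!\top} \nabla f(x_{k+1}), \qquad y_k = \nabla f(x_{k+1}) - \nabla f(x_k), $$

and the standard Wolfe conditions on the step size $\alpha_k$ referred to in the abstract are

$$ f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \alpha_k \nabla f(x_k)^{\top} d_k, \qquad \nabla f(x_k + \alpha_k d_k)^{\top} d_k \ge c_2 \nabla f(x_k)^{\top} d_k, $$

with $0 < c_1 < c_2 < 1$.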

Keywords: Vector optimization; Pareto-optimality; conjugate gradient method; Hager-Zhang conjugate gradient method; unconstrained optimization; line search algorithm; Wolfe conditions.

Category 1: Other Topics (Multi-Criteria Optimization)

Citation: M. L. N. Gonçalves and L. F. Prudente, On the extension of the Hager-Zhang conjugate gradient method for vector optimization, technical report, 2018.


Entry Submitted: 11/15/2018
Entry Accepted: 11/15/2018
Entry Last Modified: 11/23/2019
