Optimization Online

Non-linear conjugate gradient methods for vector optimization

L. R. Lucambio Pérez (lrlucambioperez@gmail.com)
L. F. Prudente (lfprudente@gmail.com)

Abstract: In this work, we propose non-linear conjugate gradient methods for finding critical points of vector-valued functions with respect to the partial order induced by a closed, convex, and pointed cone with non-empty interior. No convexity assumption is made on the objectives. The concepts of Wolfe and Zoutendijk conditions are extended to vector-valued optimization. In particular, we show that there exist intervals of stepsizes satisfying the Wolfe-type conditions. The convergence analysis covers the vector extensions of the Fletcher-Reeves, Conjugate Descent, Dai-Yuan, Polak-Ribière-Polyak, and Hestenes-Stiefel parameters, which reduce to the classical ones in the scalar minimization case. Under inexact line searches and without regular restarts, we prove that the sequences generated by the proposed methods find points satisfying the first-order necessary condition for Pareto-optimality.
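For orientation, the scalar template behind these methods is the classical nonlinear conjugate gradient iteration x_{k+1} = x_k + alpha_k d_k with d_{k+1} = -grad f(x_{k+1}) + beta_k d_k, where alpha_k satisfies the Wolfe conditions and beta_k follows a rule such as Fletcher-Reeves or Polak-Ribière-Polyak. The Python sketch below illustrates only this scalar template, not the paper's vector-valued algorithm; the function nonlinear_cg, the restart fallback, and the Rosenbrock test problem are illustrative assumptions, and the Wolfe line search is delegated to scipy.optimize.line_search.

import numpy as np
from scipy.optimize import line_search  # strong Wolfe line search

def nonlinear_cg(f, grad, x0, beta_rule="FR", tol=1e-6, max_iter=200):
    # Scalar-case sketch: the paper's vector method replaces -grad with a
    # steepest-descent direction w.r.t. the ordering cone.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:         # first-order stationarity test
            return x
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:                    # Wolfe search failed: restart (fallback, assumed)
            d, alpha = -g, 1e-4
        x_new = x + alpha * d
        g_new = grad(x_new)
        if beta_rule == "FR":                # Fletcher-Reeves
            beta = (g_new @ g_new) / (g @ g)
        else:                                # Polak-Ribiere-Polyak (non-negative variant)
            beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Illustrative usage on the Rosenbrock function:
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(nonlinear_cg(f, grad, np.array([-1.2, 1.0]), beta_rule="PR"))

In the vector-valued setting of the paper, the gradient-based quantities above are, roughly speaking, replaced by their counterparts with respect to the partial order induced by the cone, and the stepsizes must satisfy the vector Wolfe-type conditions established by the authors.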

Keywords: Vector optimization; Pareto-optimality; conjugate gradient method; unconstrained optimization; line search algorithm; Wolfe conditions.

Category 1: Other Topics (Multi-Criteria Optimization)

Citation: L. R. Lucambio Pérez, L. F. Prudente, Non-linear conjugate gradient methods for vector optimization, preprint, Federal University of Goiás, April 2017.

Download: [PDF]

Entry Submitted: 04/20/2017
Entry Accepted: 04/20/2017
Entry Last Modified: 04/20/2017
