Optimization Online

Worst case complexity of direct search under convexity

M. Dodangeh(dodangeh***at***mat.uc.pt)
L. N. Vicente(lnv***at***mat.uc.pt)

Abstract: In this paper we prove that the broad class of direct-search methods of directional type, based on imposing sufficient decrease to accept new iterates, exhibits the same global rate or worst case complexity bound as the gradient method for the unconstrained minimization of a convex and smooth function. More precisely, we show that the number of iterations needed to reduce the norm of the gradient of the objective function below a certain threshold is at most proportional to the inverse of the threshold. Our result is slightly less general than Nesterov's for the gradient method, in the sense that we require more than just convexity of the objective function and boundedness of the distance from the initial iterate to the solution set. This additional condition can, however, be satisfied in several scenarios, such as strong or uniform convexity, boundedness of the initial level set, or boundedness of the distance from the initial contour set to the solution set. It is a mild price to pay for deriving such a global rate for zero-order methods.
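For readers unfamiliar with the algorithm class, the Python sketch below illustrates what a directional direct-search method with a sufficient-decrease acceptance test looks like. It is a minimal illustration only: the coordinate polling set, the forcing function rho(alpha) = c*alpha^2, and the expansion/contraction factors are common choices assumed here for concreteness, not the paper's exact scheme.

```python
import numpy as np

def direct_search(f, x0, alpha0=1.0, c=1e-4, theta=0.5, gamma=2.0,
                  alpha_tol=1e-8, max_iter=10_000):
    """Directional direct search with a sufficient-decrease acceptance test.

    Polls a positive spanning set D (here: the +/- coordinate directions).
    A trial point x + alpha*d is accepted only if it yields sufficient
    decrease, f(x + alpha*d) < f(x) - rho(alpha), with the forcing
    function rho(alpha) = c * alpha**2 (an illustrative choice).
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    D = np.vstack([np.eye(n), -np.eye(n)])  # positive spanning set
    alpha, fx = alpha0, f(x)
    for _ in range(max_iter):
        if alpha < alpha_tol:               # step size serves as stopping test
            break
        for d in D:
            trial = x + alpha * d
            ftrial = f(trial)
            if ftrial < fx - c * alpha**2:  # sufficient decrease, not simple decrease
                x, fx = trial, ftrial
                alpha *= gamma              # expand after a successful poll
                break
        else:
            alpha *= theta                  # contract after an unsuccessful poll
    return x, fx

# Toy usage: a smooth, strongly convex quadratic
x_best, f_best = direct_search(lambda x: float(np.sum((x - 1.0) ** 2)),
                               np.zeros(3))
```

The sufficient-decrease test (accepting a trial point only when it improves on f(x) by at least rho(alpha), rather than by any amount) is the feature of this method class that the worst case complexity analysis in the paper hinges on.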

Keywords: Derivative-free optimization, direct search, worst case complexity, sufficient decrease, convexity.

Category 1: Convex and Nonsmooth Optimization

Category 2: Nonlinear Optimization

Citation: Preprint 13-10, Dept. Mathematics, Univ. Coimbra.

Download: [PDF]

Entry Submitted: 04/23/2013
Entry Accepted: 04/23/2013
Entry Last Modified: 04/23/2013
