Optimization Online

A Quasi-Newton Algorithm for Nonconvex, Nonsmooth Optimization with Global Convergence Guarantees

Frank E. Curtis (frank.e.curtis***at***gmail.com)
Xiaocun Que (xiq209***at***lehigh.edu)

Abstract: A line search algorithm for minimizing nonconvex and/or nonsmooth objective functions is presented. The algorithm is a hybrid of a standard Broyden–Fletcher–Goldfarb–Shanno (BFGS) method and an adaptive gradient sampling (GS) method. The BFGS strategy is employed because it typically yields fast convergence to the vicinity of a stationary point, while the adaptive GS strategy ensures that the iterates continue to converge to such a point. Under suitable assumptions, it is proved that the algorithm converges globally with probability one. The algorithm has been implemented in C++, and the results of numerical experiments illustrate the efficacy of the proposed approach.
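For readers unfamiliar with gradient sampling, the following is a minimal, illustrative sketch of the GS component alone, not the authors' C++ implementation or their BFGS-GS hybrid. It shows the core mechanics the abstract alludes to: sample gradients in an epsilon-ball around the iterate, take the negative of the minimum-norm element of their convex hull as a search direction, backtrack along it, and shrink the sampling radius near epsilon-stationary points. All names (gs_step, min_norm_in_hull), the Frank-Wolfe QP solver, and the parameter choices are assumptions made for illustration.

```python
import numpy as np

def min_norm_in_hull(G, iters=200):
    # Min-norm point of conv{columns of G}, computed by Frank-Wolfe on the
    # simplex of convex-combination weights (an illustrative choice of solver).
    m = G.shape[1]
    lam = np.full(m, 1.0 / m)
    for _ in range(iters):
        g = G @ lam                             # current point in the hull
        i = int(np.argmin(G.T @ g))             # vertex minimizing the linearization
        d = G[:, i] - g                         # Frank-Wolfe direction in R^n
        dd = d @ d
        if dd <= 1e-16:
            break
        t = min(1.0, max(0.0, -(g @ d) / dd))   # exact step for ||g + t*d||^2
        lam *= (1.0 - t)
        lam[i] += t
    return G @ lam

def gs_step(f, grad, x, eps, m=None, c=1e-4, rng=np.random.default_rng(0)):
    # One epsilon-gradient-sampling line-search step (illustrative parameters).
    n = x.size
    m = m or 2 * n
    u = rng.standard_normal((m, n))
    u /= np.linalg.norm(u, axis=1, keepdims=True)
    r = rng.random((m, 1)) ** (1.0 / n)
    pts = x + eps * r * u                       # uniform samples in the eps-ball
    G = np.column_stack([grad(x)] + [grad(p) for p in pts])
    g = min_norm_in_hull(G)                     # approx. eps-steepest-descent direction
    if np.linalg.norm(g) < 1e-8:
        return x, eps / 2                       # approx. eps-stationary: shrink ball
    d, fx, t = -g, f(x), 1.0
    while f(x + t * d) > fx + c * t * (g @ d) and t > 1e-12:
        t /= 2                                  # backtracking Armijo line search
    return x + t * d, eps

# Usage on a nonsmooth, nonconvex test function (chosen here for illustration):
f = lambda x: 10 * abs(x[0]) + (x[1] ** 2 - 1.0) ** 2
grad = lambda x: np.array([10 * np.sign(x[0]), 4 * x[1] * (x[1] ** 2 - 1.0)])
x, eps = np.array([3.0, -2.0]), 0.5
for _ in range(200):
    x, eps = gs_step(f, grad, x, eps)
```

The paper's algorithm couples a step of this flavor with BFGS updating, using the quasi-Newton direction for speed and falling back on adaptive sampling to guarantee convergence; that interplay is what the hedged sketch above does not capture.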

Keywords: nonsmooth optimization, nonconvex optimization, unconstrained optimization, quasi-Newton methods, gradient sampling, line search methods

Category 1: Nonlinear Optimization

Category 2: Nonlinear Optimization (Unconstrained Optimization)

Category 3: Convex and Nonsmooth Optimization (Nonsmooth Optimization)

Citation: F. E. Curtis and X. Que. A Quasi-Newton Algorithm for Nonconvex, Nonsmooth Optimization with Global Convergence Guarantees. Mathematical Programming Computation, 7(4):399-428, 2015.

Entry Submitted: 05/31/2014
Entry Accepted: 05/31/2014
Entry Last Modified: 01/02/2017
