Optimization Online

A simple Newton method for local nonsmooth optimization

Adrian Lewis (adrian.lewis@cornell.edu)
Calvin Wylie (cjw278@cornell.edu)

Abstract: Superlinear convergence has been an elusive goal for black-box nonsmooth optimization. Even in the convex case, the subgradient method is very slow, and while some cutting plane algorithms, including traditional bundle methods, are popular in practice, local convergence is still sluggish. Faster variants depend either on problem structure or on analyses that elide sequences of "null" steps. Motivated by a semi-structured approach to optimization and the sequential quadratic programming philosophy, we describe a new bundle Newton method that incorporates second-order objective information with the usual linear approximation oracle. One representative problem class consists of maxima of several smooth functions, individually inaccessible to the oracle. Given as additional input just the cardinality of the optimal active set, we prove local quadratic convergence. A simple implementation shows promise on more general functions, both convex and nonconvex, and suggests first-order analogues.
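
To illustrate the flavor of such a Newton step on the representative problem class (a maximum of smooth functions, with the cardinality of the optimal active set known), here is a minimal numerical sketch. It is not the authors' algorithm: it assumes direct access to the individual smooth pieces and their Hessians, which the paper's black-box oracle does not provide, and a real bundle method would instead assemble its model from oracle outputs collected at nearby iterates. The test functions f1, f2 and the multiplier initialization are hypothetical choices made purely for illustration.

import numpy as np

# Toy instance of the representative problem class: f(x) = max(f1(x), f2(x)).
# The minimizer is (1, 0), where both smooth pieces are active (cardinality 2).
# NOTE: the paper's oracle returns only f(x) and one subgradient; here we
# evaluate the pieces and their Hessians directly, purely for illustration.
def f1(x): return x[0]**2 + x[1]**2
def g1(x): return np.array([2.0*x[0], 2.0*x[1]])
def f2(x): return (x[0] - 2.0)**2 + x[1]**2
def g2(x): return np.array([2.0*(x[0] - 2.0), 2.0*x[1]])
H1 = 2.0*np.eye(2)
H2 = 2.0*np.eye(2)

def newton_sqp_step(x, lam):
    """One SQP-style Newton step for  min v  s.t.  fj + gj.d + 0.5 d.W d <= v,
    treating both pieces as active and freezing the current multipliers in
    the Lagrangian Hessian W = lam[0]*H1 + lam[1]*H2."""
    n, k = 2, 2
    W = lam[0]*H1 + lam[1]*H2
    G = np.column_stack([g1(x), g2(x)])   # gradients of the active pieces
    fvals = np.array([f1(x), f2(x)])
    # Linear KKT system in the unknowns (d, lam_new, v); the common quadratic
    # term 0.5 d.W d is absorbed into v, leaving d and lam_new unchanged:
    #   W d + G lam_new = 0     (stationarity)
    #   G^T d - v = -fvals      (active piece values agree)
    #   sum(lam_new) = 1        (multipliers on the simplex)
    A = np.zeros((n + k + 1, n + k + 1))
    A[:n, :n] = W
    A[:n, n:n + k] = G
    A[n:n + k, :n] = G.T
    A[n:n + k, -1] = -1.0
    A[-1, n:n + k] = 1.0
    rhs = np.concatenate([np.zeros(n), -fvals, [1.0]])
    sol = np.linalg.solve(A, rhs)
    return x + sol[:n], sol[n:n + k]

x, lam = np.array([0.8, 0.3]), np.array([0.5, 0.5])
for it in range(4):
    x, lam = newton_sqp_step(x, lam)
    print(it, x, max(f1(x), f2(x)))

Because the toy pieces here are quadratic, the very first step lands on the minimizer (1, 0); on genuinely nonlinear pieces one would instead expect the local quadratic convergence described in the abstract.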

Keywords: nonsmooth optimization, bundle method, Newton method, sequential quadratic programming, nonconvex

Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization)

Citation: School of Operations Research and Information Engineering (ORIE), Cornell University, July 2019

Download: [PDF]

Entry Submitted: 07/26/2019
Entry Accepted: 07/26/2019
Entry Last Modified: 07/26/2019
