Optimization Online

Exploiting Negative Curvature in Deterministic and Stochastic Optimization

Frank E. Curtis (frank.e.curtis***at***gmail.com)
Daniel Robinson (daniel.p.robinson***at***gmail.com)

Abstract: This paper addresses the question of whether it can be beneficial for an optimization algorithm to follow directions of negative curvature. Although some prior work has established convergence results for algorithms that integrate both descent and negative curvature directions, there has not yet been numerical evidence showing that such methods offer significant performance improvements. In this paper, we present new frameworks for combining descent and negative curvature directions: alternating two-step approaches and dynamic step approaches. A unique aspect of each of our frameworks is that fixed stepsizes can be used (rather than line searches or trust regions), which makes the methods viable for both deterministic and stochastic settings. For deterministic problems, we show that our dynamic framework yields significant gains in performance (in terms of lower objective function values in a fixed number of iterations) compared to a gradient descent method. We also show that while the gains offered in a stochastic setting might be more modest, they can be notable.
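To make the two-step idea concrete, below is a minimal sketch (in Python/NumPy) of an alternating scheme of the kind the abstract describes: a fixed-stepsize gradient descent step alternated with a fixed step along a direction of negative curvature whenever the Hessian has one. The eigendecomposition-based curvature direction, the stepsizes `alpha` and `beta`, and the toy objective are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def alternating_two_step(grad, hess, x0, alpha=0.01, beta=0.1, iters=200):
    """Sketch of an alternating two-step iteration: a fixed-stepsize
    descent step, then a fixed-stepsize negative-curvature step when
    the Hessian has a negative eigenvalue. No line search or trust
    region is used, mirroring the fixed-stepsize feature highlighted
    in the abstract. Illustrative only, not the paper's exact method."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # Descent step along the negative gradient.
        x = x - alpha * grad(x)
        # Negative-curvature step: take the leftmost eigenpair of the Hessian.
        eigvals, eigvecs = np.linalg.eigh(hess(x))
        if eigvals[0] < 0.0:
            d = eigvecs[:, 0]
            # Orient d so it does not point uphill along the gradient.
            if grad(x) @ d > 0.0:
                d = -d
            x = x + beta * d
    return x

# Toy nonconvex example with a saddle point at the origin:
# f(x, y) = x**2 + y**4/4 - y**2
grad = lambda v: np.array([2.0 * v[0], v[1] ** 3 - 2.0 * v[1]])
hess = lambda v: np.array([[2.0, 0.0], [0.0, 3.0 * v[1] ** 2 - 2.0]])
print(alternating_two_step(grad, hess, np.array([1.0, 1e-3])))
```

On this toy objective, the negative-curvature step pushes the iterate away from the saddle at the origin, which a pure gradient method started near the saddle would escape only slowly.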

Keywords: stochastic optimization, unconstrained optimization, negative curvature, deep neural networks

Category 1: Stochastic Programming

Category 2: Nonlinear Optimization (Unconstrained Optimization)

Citation: Lehigh/COR@L Technical Report 17T-003, Lehigh University, 02/2017

Download: [PDF]

Entry Submitted: 02/28/2017
Entry Accepted: 02/28/2017
Entry Last Modified: 10/15/2017
