Optimization Online

Trust-Region Algorithms for Training Responses: Machine Learning Methods Using Indefinite Hessian Approximations

Jennifer Erway (erwayjb***at***wfu.edu)
Joshua Griffin (Joshua.Griffin***at***sas.com)
Riadh Omheni (Riadh.Omheni***at***sas.com)
Roummel Marcia (rmarcia***at***ucmerced.edu)

Abstract: Machine learning (ML) problems are often posed as highly nonlinear and nonconvex unconstrained optimization problems. Methods for solving ML problems based on stochastic gradient descent are easily scaled to very large problems but may involve fine-tuning many hyper-parameters. Quasi-Newton approaches based on the limited-memory Broyden-Fletcher-Goldfarb-Shanno (BFGS) update typically do not require manual hyper-parameter tuning but suffer from approximating a potentially indefinite Hessian with a positive-definite matrix. Hessian-free methods leverage the ability to perform Hessian-vector products without forming the entire Hessian matrix, but each iteration is significantly more expensive than a quasi-Newton iteration. In this paper we propose an alternative approach for solving ML problems, based on a quasi-Newton trust-region framework for large-scale optimization that allows for indefinite Hessian approximations. Numerical experiments on a standard test data set show that, with a fixed computational time budget, the proposed methods achieve better results than the traditional limited-memory BFGS and Hessian-free methods.
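The indefinite Hessian approximation the abstract alludes to is the symmetric rank-one (SR1) quasi-Newton update (see the keywords below), which, unlike BFGS, can produce indefinite matrices and thus capture negative curvature. As a hedged illustration (not the paper's implementation), the sketch below applies the standard SR1 formula B_{k+1} = B_k + (r r^T)/(r^T s) with r = y - B s to a 2-variable quadratic whose true Hessian A is indefinite; the example problem and the `sr1_update` helper are constructed here for illustration only.

```python
# Sketch of the symmetric rank-one (SR1) quasi-Newton update on the
# indefinite quadratic f(x) = 0.5 * x^T A x, where y_k = A s_k exactly.
# Unlike BFGS, SR1 updates need not stay positive definite.

def matvec(M, v):
    """Matrix-vector product for a matrix stored as nested lists."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def sr1_update(B, s, y, skip_tol=1e-12):
    """SR1: B + (r r^T)/(r^T s) with r = y - B s; skip if denominator is tiny."""
    Bs = matvec(B, s)
    r = [y[i] - Bs[i] for i in range(len(s))]
    denom = sum(r_i * s_i for r_i, s_i in zip(r, s))
    if abs(denom) < skip_tol:          # standard safeguard: skip the update
        return B
    n = len(s)
    return [[B[i][j] + r[i] * r[j] / denom for j in range(n)] for i in range(n)]

A = [[2.0, 0.0], [0.0, -1.0]]   # true Hessian: indefinite (eigenvalues 2, -1)
B = [[1.0, 0.0], [0.0, 1.0]]    # initial approximation: positive-definite identity

# On a quadratic, SR1 recovers A after n linearly independent steps.
for s in ([1.0, 0.0], [0.0, 1.0]):
    y = matvec(A, s)            # curvature pair: y = gradient difference = A s
    B = sr1_update(B, s, y)

print(B)  # [[2.0, 0.0], [0.0, -1.0]] -- indefinite, matches A exactly
```

The final approximation is indefinite, which is precisely the property a BFGS update cannot reproduce; trust-region methods, unlike line searches, remain well defined with such an approximation.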

Keywords: Nonconvex optimization, quasi-Newton methods, symmetric rank-one update, machine learning

Category 1: Nonlinear Optimization (Unconstrained Optimization )

Category 2: Applications -- Science and Engineering (Other )

Citation: Technical report 2017-2, Department of Mathematics and Statistics, Wake Forest University (2017)

Entry Submitted: 10/26/2017
Entry Accepted: 10/26/2017
Entry Last Modified: 05/22/2019

