Optimization Online

Gradient Descent Only Converges to Minimizers

Jason D. Lee (jasondlee88@gmail.com)
Max Simchowitz (msimchowitz@berkeley.edu)
Michael I. Jordan (jordan@cs.berkeley.edu)
Benjamin Recht (brecht@berkeley.edu)

Abstract: We show that gradient descent converges to a local minimizer, almost surely with random initialization. This is proved by applying the Stable Manifold Theorem from dynamical systems theory.
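The result can be illustrated numerically. The sketch below is a toy example constructed here, not code from the paper: it runs gradient descent with a small constant step size on f(x, y) = (x^2 - 1)^2 + y^2, which has a strict saddle point at the origin and local minimizers at (+1, 0) and (-1, 0); the test function, step size, and iteration count are all assumptions made for illustration. Started from a random point, the iterates avoid the saddle and settle at one of the two minimizers.

```python
# Minimal illustration (not from the paper): gradient descent on
# f(x, y) = (x^2 - 1)^2 + y^2, which has a strict saddle at (0, 0)
# and local minimizers at (+1, 0) and (-1, 0).
import numpy as np

def grad_f(p):
    """Gradient of f(x, y) = (x^2 - 1)^2 + y^2."""
    x, y = p
    return np.array([4.0 * x * (x**2 - 1.0), 2.0 * y])

def gradient_descent(p0, step=0.01, iters=5000):
    """Plain gradient descent with a constant step size (chosen here)."""
    p = np.array(p0, dtype=float)
    for _ in range(iters):
        p = p - step * grad_f(p)
    return p

rng = np.random.default_rng(0)
p0 = rng.standard_normal(2)        # random initialization
p_final = gradient_descent(p0)
print(p0, "->", p_final)           # lands near (+1, 0) or (-1, 0), not the saddle at (0, 0)
```

Initializing exactly on the y-axis (the stable manifold of the saddle) would send the iterates to (0, 0), but that set has measure zero, which is the sense in which convergence to a minimizer holds almost surely.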

Keywords: Gradient descent, non-convex, saddle points, local minimizer

Category 1: Nonlinear Optimization (Unconstrained Optimization)

Citation:

Download: [PDF]

Entry Submitted: 02/17/2016
Entry Accepted: 02/17/2016
Entry Last Modified: 02/17/2016
