Optimization Online





 

Block Coordinate Descent Almost Surely Converges to a Stationary Point Satisfying the Second-order Necessary Condition

Enbin Song (e.b.song***at***163.com)
Zhubin Shen (zhubin_shen***at***126.com)
Qingjiang Shi (qing.j.shi***at***gmail.com)

Abstract: Given a non-convex, twice continuously differentiable cost function with a Lipschitz continuous gradient, we prove that the block coordinate gradient descent, block mirror descent, and proximal block coordinate descent methods each converge, almost surely under random initialization, to a stationary point satisfying the second-order necessary condition. All of our results follow from the center-stable manifold theorem and Ostrowski's lemma.
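For reference, the second-order necessary condition at a stationary point x* reads \nabla f(x^*) = 0 and \nabla^2 f(x^*) \succeq 0, so convergence to such points rules out strict saddles. Below is a minimal Python sketch of cyclic block coordinate gradient descent on a toy non-convex cost with a strict saddle at the origin; the cost function, step size, block partition, and cyclic update order are illustrative assumptions, not the authors' exact scheme.

import numpy as np

# Minimal sketch of block coordinate gradient descent (illustrative, not the
# authors' exact scheme): cyclically update one coordinate block at a time
# using the corresponding block of the full gradient.
def block_coordinate_gradient_descent(grad, x0, blocks, step=0.1, iters=2000):
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(iters):
        for b in blocks:            # one cyclic sweep over the blocks
            g = grad(x)
            x[b] -= step * g[b]     # gradient step on the current block only
    return x

# Toy non-convex cost f(x, y) = x**4/4 - x**2/2 + y**2/2 (assumed example).
# Stationary points: (-1, 0), (0, 0), (1, 0); the origin is a strict saddle.
grad = lambda z: np.array([z[0]**3 - z[0], z[1]])

x0 = 0.5 * np.random.randn(2)       # random initialization, as in the paper
x_star = block_coordinate_gradient_descent(grad, x0, blocks=[[0], [1]])
print(x_star)                       # almost surely near (1, 0) or (-1, 0)

With random initialization the iterates almost never start on the stable manifold of the saddle at the origin, so they settle at one of the two local minima; this is the behavior the paper establishes in general via the center-stable manifold theorem.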

Keywords: Block coordinate gradient descent, Block mirror descent, Proximal block coordinate descent, Saddle points, Local minimum, Non-convex

Category 1: Nonlinear Optimization (Unconstrained Optimization )

Download: [PDF]

Entry Submitted: 11/24/2017
Entry Accepted: 11/24/2017
Entry Last Modified: 11/25/2017
