Optimization Online
New analysis of linear convergence of gradient-type methods via unifying error bound conditions

Zhang Hui (h.zhang1984***at***163.com)

Abstract: Linear convergence of gradient-type methods for non-strongly convex optimization has been widely studied by introducing several notions that serve as sufficient conditions. Influential examples include the error bound property, the restricted strong convexity property, the quadratic growth property, and the Kurdyka-Łojasiewicz property. In this paper, we first define a group of error bound conditions in a unified way that covers all of these influential sufficient conditions for linear convergence analysis. We then establish a comprehensive set of relationships among the unified error bound conditions. To the best of our knowledge, this is the first study to show the very close connections among all of the listed sufficient conditions. Using the unified error bound conditions, which are strictly weaker than strong convexity (the traditional guarantee of linear convergence), we provide a new linear convergence analysis for several fundamental algorithms, including the proximal point algorithm, dual gradient algorithms, and the cyclic block coordinate gradient descent method. Finally, by introducing a composition error bound condition, we obtain Q-linear convergence of gradient descent with Nesterov's acceleration on a new class of functions for which such convergence had not previously been shown.
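
For orientation, two of the conditions named in the abstract have standard forms in the literature; the unified error bound conditions of the paper generalize these, and the precise definitions there may differ in detail. For a smooth convex function $f$ with minimum value $f^*$ and solution set $X^*$, and for all $x$ in a fixed level set,

  \text{error bound:}\qquad \operatorname{dist}(x, X^*) \le \kappa\, \|\nabla f(x)\|, \qquad \kappa > 0,
  \text{quadratic growth:}\qquad f(x) - f^* \ge \tfrac{\mu}{2}\, \operatorname{dist}(x, X^*)^2, \qquad \mu > 0.

Strong convexity implies both, but not conversely: for instance, the least-squares function $f(x) = (a^\top x - b)^2$ with a nonzero vector $a \in \mathbb{R}^n$, $n \ge 2$, satisfies both conditions globally while not being strongly convex. A sequence $\{x_k\}$ converges Q-linearly to $x^*$ if $\|x_{k+1} - x^*\| \le \rho\, \|x_k - x^*\|$ for some fixed $\rho \in (0,1)$.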

Keywords: gradient descent, linear convergence, error bound condition, proximal point algorithm, dual gradient algorithm, cyclic block coordinate gradient descent, Nesterov's acceleration

Category 1: Convex and Nonsmooth Optimization

Category 2: Nonlinear Optimization

Citation: arXiv:1606.00269

Entry Submitted: 08/17/2016
Entry Accepted: 08/17/2016
Entry Last Modified: 06/23/2017
