Optimization Online

Linear Convergence of Gradient and Proximal-Gradient Methods Under the Polyak-Łojasiewicz Condition

Hamed Karimi (hamedkarim***at***gmail.com)
Julie Nutini (jnutini***at***cs.ubc.ca)
Mark Schmidt (schmidtm***at***cs.ubc.ca)

Abstract: In 1963, Polyak proposed a simple condition that is sufficient to show a global linear convergence rate for gradient descent. This condition is a special case of the Łojasiewicz inequality proposed in the same year, and it does not require strong convexity (or even convexity). In this work, we show that this much-older Polyak-Łojasiewicz (PL) inequality is actually weaker than the main conditions that have been explored to show linear convergence rates without strong convexity over the last 25 years. We also use the PL inequality to give new analyses of randomized and greedy coordinate descent methods, sign-based gradient descent methods, and stochastic gradient methods in the classic setting (with decreasing or constant step-sizes) as well as the variance-reduced setting. We further propose a generalization that applies to proximal-gradient methods for non-smooth optimization, leading to simple proofs of linear convergence of these methods. Along the way, we give simple convergence results for a wide variety of problems in machine learning: least squares, logistic regression, boosting, resilient backpropagation, L1-regularization, support vector machines, stochastic dual coordinate ascent, and stochastic variance-reduced gradient methods.
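A brief technical note for context (a minimal sketch; the notation f^*, \mu, L, x^k is ours for illustration and does not appear in this entry). The PL inequality requires

    \frac{1}{2} \|\nabla f(x)\|^2 \;\ge\; \mu \,\bigl(f(x) - f^*\bigr) \quad \text{for all } x,

where f^* is the minimum value of f and \mu > 0 is the PL constant. If \nabla f is L-Lipschitz, the gradient step x^{k+1} = x^k - \tfrac{1}{L}\nabla f(x^k) satisfies the descent lemma f(x^{k+1}) \le f(x^k) - \tfrac{1}{2L}\|\nabla f(x^k)\|^2, and applying the PL inequality yields

    f(x^{k+1}) - f^* \;\le\; \Bigl(1 - \frac{\mu}{L}\Bigr)\bigl(f(x^k) - f^*\bigr),

a global linear (geometric) rate obtained without any convexity assumption.

A standard example where PL holds without strong convexity is least squares with a rank-deficient design matrix. The following numerical sketch (our own construction, not code from the paper) illustrates the resulting geometric decay of the suboptimality:

    import numpy as np

    # f(x) = 0.5 * ||A x - b||^2 with rank-deficient A: this satisfies the
    # PL inequality but is not strongly convex (the Hessian A^T A is singular).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 10))  # rank <= 2
    b = rng.standard_normal(50)

    def f(x):
        return 0.5 * np.linalg.norm(A @ x - b) ** 2

    def grad(x):
        return A.T @ (A @ x - b)

    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of grad f (largest singular value squared)
    f_star = f(np.linalg.pinv(A) @ b)  # minimum value of f

    x = np.zeros(10)
    for k in range(200):
        x -= grad(x) / L               # gradient descent with step size 1/L
    print(f(x) - f_star)               # suboptimality decays geometrically toward 0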

Keywords: least squares, logistic regression, gradient descent, coordinate descent, stochastic gradient, variance reduction, boosting, support vector machines, L1-regularization

Category 1: Convex and Nonsmooth Optimization

Citation: To appear in Volume 9851 of the Lecture Notes in Computer Science series (ECML-PKDD 2016)

Download: [PDF]

Entry Submitted: 08/16/2016
Entry Accepted: 08/16/2016
Entry Last Modified: 10/20/2016
