Optimization Online

Accelerated Proximal Stochastic Dual Coordinate Ascent for Regularized Loss Minimization

Shai Shalev-Shwartz (shais***at***cs.huji.ac.il)
Tong Zhang (tzhang***at***stat.rutgers.edu)

Abstract: We introduce a proximal version of the stochastic dual coordinate ascent method and show how to accelerate it using an inner-outer iteration procedure. We analyze the runtime of the framework and obtain rates that improve on the state of the art for several key machine learning optimization problems, including SVM, logistic regression, ridge regression, Lasso, and multiclass SVM. Experiments validate our theoretical findings.
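For readers who want the problem written out: the regularized loss minimization template targeted here (our paraphrase; the precise smoothness and strong-convexity assumptions on the losses and regularizer are spelled out in the paper) is

    \min_{w \in \mathbb{R}^d} P(w) = \frac{1}{n} \sum_{i=1}^{n} \phi_i(x_i^\top w) + \lambda\, g(w),

where each \phi_i is a convex loss (hinge, logistic, squared, ...), g is a convex regularizer handled through its proximal operator, and \lambda > 0. Roughly, the accelerated scheme runs an outer loop that adds a quadratic term centered at a momentum point and an inner loop of proximal stochastic dual coordinate ascent on the resulting subproblem.

As a concrete point of reference, the sketch below implements plain (non-accelerated, non-proximal) SDCA for the ridge regression special case, where the dual coordinate maximization has a closed form. It is an illustration under these simplifying assumptions, not the paper's algorithm.

import numpy as np

def sdca_ridge(X, y, lam, epochs=20, seed=0):
    """Plain (non-accelerated) SDCA for ridge regression:
    min_w (1/n) * sum_i 0.5*(x_i^T w - y_i)^2 + (lam/2)*||w||^2.
    Illustrative special case only; the paper's Prox-SDCA adds a
    proximal step for composite regularizers, and the accelerated
    variant wraps these inner iterations in an outer momentum loop.
    """
    n, d = X.shape
    rng = np.random.default_rng(seed)
    alpha = np.zeros(n)          # dual variables, one per example
    w = np.zeros(d)              # primal iterate; invariant: w = X^T alpha / (lam * n)
    sq_norms = (X ** 2).sum(axis=1)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Exact maximization of the dual objective in coordinate i
            # (the squared loss admits a closed-form coordinate update).
            delta = (y[i] - X[i] @ w - alpha[i]) / (1.0 + sq_norms[i] / (lam * n))
            alpha[i] += delta
            w += (delta / (lam * n)) * X[i]
    return w

# Tiny usage example on synthetic data (hypothetical, for illustration):
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.standard_normal((200, 10))
    w_true = rng.standard_normal(10)
    y = X @ w_true + 0.1 * rng.standard_normal(200)
    w_hat = sdca_ridge(X, y, lam=0.1)
    print("||w_hat - w_true|| =", np.linalg.norm(w_hat - w_true))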

Keywords: Acceleration, proximal methods, coordinate ascent, randomized algorithms, regularized loss minimization

Category 1: Convex and Nonsmooth Optimization

Category 2: Global Optimization (Stochastic Approaches)

Category 3: Applications -- Science and Engineering (Statistics)

Citation:

Download: [PDF]

Entry Submitted: 09/10/2013
Entry Accepted: 09/10/2013
Entry Last Modified: 10/08/2013
