Optimization Online

An Accelerated Communication-Efficient Primal-Dual Optimization Framework for Structured Machine Learning

Chenxin Ma (chm514@lehigh.edu)
Martin Jaggi (martin.jaggi@epfl.ch)
Frank E. Curtis (frank.e.curtis@gmail.com)
Nathan Srebro (nati@ttic.edu)
Martin Takac (takac@lehigh.edu)

Abstract: Distributed optimization algorithms are essential for training machine learning models on very large-scale datasets, but they often suffer from communication bottlenecks. To address this issue, the communication-efficient primal-dual coordinate ascent framework CoCoA and its improved variant CoCoA+ were proposed; both achieve a convergence rate of $\mathcal{O}(1/t)$ for solving empirical risk minimization problems with Lipschitz continuous losses. In this paper, an accelerated variant of CoCoA+ is proposed and shown to possess a convergence rate of $\mathcal{O}(1/t^2)$ in terms of reducing suboptimality. The analysis is also notable in that the rate bounds involve constants that, except in extreme cases, are significantly smaller than those previously provided for CoCoA+. Numerical experiments show that acceleration can lead to significant performance gains.
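
To make the rate gap concrete, below is a minimal sketch in Python with NumPy comparing plain gradient descent, whose suboptimality decays as $\mathcal{O}(1/t)$, with a Nesterov-accelerated variant achieving the $\mathcal{O}(1/t^2)$ decay cited above, on a toy smooth convex quadratic. This is only a generic single-machine illustration of acceleration; it is not the paper's accelerated CoCoA+ update, and it omits the local subproblems and communication structure that the framework is actually about.

```python
import numpy as np

# Toy smooth convex objective f(x) = 0.5 x^T A x - b^T x with A positive definite.
rng = np.random.default_rng(0)
n = 50
M = rng.standard_normal((n, n))
A = M.T @ M / n + 1e-3 * np.eye(n)
b = rng.standard_normal(n)

def f(x):
    return 0.5 * x @ A @ x - b @ x

def grad(x):
    return A @ x - b

L = np.linalg.eigvalsh(A)[-1]            # largest eigenvalue = gradient Lipschitz constant
f_star = f(np.linalg.solve(A, b))        # optimal value, for measuring suboptimality

# Plain gradient descent: f(x_t) - f* = O(1/t).
x = np.zeros(n)
for t in range(300):
    x = x - grad(x) / L

# Nesterov-accelerated gradient descent: f(x_t) - f* = O(1/t^2).
x_acc = np.zeros(n)                      # x_{t-1}
y = np.zeros(n)                          # extrapolation point
for t in range(1, 301):
    x_next = y - grad(y) / L             # gradient step at the extrapolated point
    y = x_next + (t - 1.0) / (t + 2.0) * (x_next - x_acc)  # momentum extrapolation
    x_acc = x_next

print(f"plain GD suboptimality after 300 steps:     {f(x) - f_star:.3e}")
print(f"accelerated suboptimality after 300 steps:  {f(x_acc) - f_star:.3e}")
```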

Keywords:

Category 1: Nonlinear Optimization

Category 2: Optimization Software and Modeling Systems (Parallel Algorithms)

Citation:

Download: [PDF]

Entry Submitted: 11/14/2017
Entry Accepted: 11/14/2017
Entry Last Modified: 11/14/2017
