Optimization Online

Smooth minimization of nonsmooth functions with parallel coordinate descent methods

Olivier Fercoq (olivier.fercoq***at***ed.ac.uk)
Peter Richtarik (peter.richtarik***at***ed.ac.uk)

Abstract: We study the performance of a family of randomized parallel coordinate descent methods for minimizing the sum of a nonsmooth convex function and a separable convex function. The problem class includes as special cases L1-regularized L1 regression and the minimization of the exponential loss (the ``AdaBoost problem''). We assume that the input data defining the loss function is contained in a sparse $m \times n$ matrix $A$ with at most $\omega$ nonzeros in each row. Our methods need $O(n \beta / \tau)$ iterations to find an approximate solution with high probability, where $\tau$ is the number of processors and $\beta = 1 + (\omega-1)(\tau-1)/(n-1)$ for the fastest variant. The $O(\cdot)$ notation hides dependence on quantities such as the required accuracy, the confidence level, and the distance of the starting iterate from an optimal point. Since $\beta / \tau$ is a decreasing function of $\tau$, the method needs fewer iterations when more processors are used. Certain variants of our algorithms perform on average only $O(\mathrm{nnz}(A) / n)$ arithmetic operations per iteration per processor and, because $\beta$ decreases when $\omega$ does, fewer iterations are needed for sparser problems.
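
A worked example of the speedup factor (illustrative numbers chosen here, not taken from the paper): with $n = 10^6$ coordinates, row sparsity $\omega = 10$ and $\tau = 100$ processors, the fastest variant gives

$$\beta = 1 + \frac{(\omega - 1)(\tau - 1)}{n - 1} = 1 + \frac{9 \cdot 99}{10^6 - 1} \approx 1.0009,$$

so the iteration bound $O(n \beta / \tau) \approx O(10^4)$ is nearly $\tau$ times smaller than the serial bound $O(n)$, i.e., the speedup in iteration count is almost linear in the number of processors. At the other extreme, a fully dense matrix ($\omega = n$) gives $\beta = \tau$ and hence $O(n \beta / \tau) = O(n)$, so additional processors no longer reduce the number of iterations.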

Keywords: parallel coordinate descent, parallel AdaBoost, convex optimization, big data, separability

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Category 2: Optimization Software and Modeling Systems (Parallel Algorithms)

Citation:

Download: [PDF]

Entry Submitted: 09/23/2013
Entry Accepted: 09/23/2013
Entry Last Modified: 09/23/2013
