Optimization Online

Parallel Algorithms for Big Data Optimization

Francisco Facchinei(facchinei***at***dis.uniroma1.it)
Simone Sagratella(sagratella***at***dis.uniroma1.it)
Gesualdo Scutari(gesualdo***at***buffalo.edu)

Abstract: We propose a decomposition framework for the parallel optimization of the sum of a differentiable function and a (block) separable nonsmooth, convex one. The latter term is usually employed to enforce structure in the solution, typically sparsity. Our framework is very flexible and includes both fully parallel Jacobi schemes and Gauss-Seidel (i.e., sequential) ones, as well as virtually all possibilities “in between” with only a subset of variables updated at each iteration. Our theoretical convergence results improve on existing ones, and numerical results on LASSO and logistic regression problems show that the new method consistently outperforms existing algorithms.
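As an illustration of the problem class and the parallel update pattern described in the abstract, the sketch below applies a fully parallel (Jacobi-style) proximal-gradient step to the LASSO problem min 0.5*||Ax - b||^2 + lam*||x||_1, where the nonsmooth l1 term enforces sparsity. This is only an illustrative sketch, not the authors' algorithm: the function names, the fixed step size 1/L, and the stopping rule are assumptions made for the example.

# Minimal sketch of a fully parallel Jacobi-type update for LASSO:
#   minimize  0.5 * ||A x - b||^2 + lam * ||x||_1
# All blocks (here, single coordinates) are updated simultaneously from the
# same iterate, i.e., a proximal-gradient step. Illustrative only; not the
# decomposition framework proposed in the paper.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (componentwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def parallel_jacobi_lasso(A, b, lam, max_iter=500, tol=1e-8):
    """Fully parallel (Jacobi) proximal-gradient iterations for LASSO."""
    n = A.shape[1]
    x = np.zeros(n)
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the smooth gradient
    step = 1.0 / L                           # assumed fixed step size
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)             # gradient of the smooth part
        x_new = soft_threshold(x - step * grad, step * lam)  # all blocks updated at once
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

# Small usage example with synthetic data
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 200))
    x_true = np.zeros(200)
    x_true[:5] = rng.standard_normal(5)
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    x_hat = parallel_jacobi_lasso(A, b, lam=0.1)
    print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-3))

In a Gauss-Seidel (sequential) variant, each block would instead be updated one after another using the most recent values of the other blocks; the flexibility claimed in the abstract lies between these two extremes, with only a subset of blocks updated at each iteration.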

Keywords: Parallel optimization, Distributed methods, Jacobi method, LASSO, Sparse solution

Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization)

Category 2: Optimization Software and Modeling Systems (Parallel Algorithms)

Category 3: Nonlinear Optimization

Citation:

Download: [PDF]

Entry Submitted: 04/02/2014
Entry Accepted: 04/02/2014
Entry Last Modified: 04/02/2014
