Optimization Online

On the convergence of a regularized Jacobi algorithm for convex optimization

Goran Banjac (goran.banjac@eng.ox.ac.uk)
Kostas Margellos (kostas.margellos@eng.ox.ac.uk)
Paul J. Goulart (paul.goulart@eng.ox.ac.uk)

Abstract: We consider a regularized version of the Jacobi algorithm, a block coordinate descent method for convex optimization with a differentiable objective function and block-separable constraints, which has recently been proposed in the literature. Under certain regularity assumptions on the objective function, this algorithm has been shown to satisfy the so-called sufficient decrease condition and, consequently, to converge in objective function value. We revisit the convergence analysis of the regularized Jacobi algorithm and show that it also converges in iterates under very mild conditions on the objective function. Moreover, we establish conditions under which the algorithm achieves a linear convergence rate.
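
To make the iteration concrete, the following is a minimal sketch of a regularized Jacobi step, under the assumption of a convex quadratic objective and per-coordinate box constraints; the regularization weight rho, the coordinate-wise block partition, and the closed-form update are illustrative choices made here for exposition, not the paper's exact formulation.

    # Illustrative sketch of a regularized Jacobi iteration on a convex
    # quadratic f(x) = 0.5 x'Qx + c'x with separable box constraints
    # lo <= x <= hi, treating each coordinate as one block. The weight
    # rho and the closed-form scalar update are assumptions for
    # illustration only.
    import numpy as np

    def regularized_jacobi(Q, c, lo, hi, rho=1.0, iters=500):
        """All coordinates are updated in parallel from the previous
        iterate x^k (Jacobi, not Gauss-Seidel):
        x_i^{k+1} = argmin_{lo_i <= x_i <= hi_i}
                    f(x_i, x_{-i}^k) + rho * (x_i - x_i^k)^2."""
        n = Q.shape[0]
        x = np.zeros(n)
        d = np.diag(Q)
        for _ in range(iters):
            # Linear term of the i-th scalar subproblem at x^k:
            # g_i = c_i + sum_{j != i} Q_ij x_j^k.
            g = c + Q @ x - d * x
            # Exact minimizer of the regularized scalar subproblem,
            # projected onto [lo_i, hi_i]; in one dimension the
            # projection of the unconstrained minimizer is the exact
            # constrained solution.
            x = np.clip((2 * rho * x - g) / (d + 2 * rho), lo, hi)
        return x

    # Tiny usage example on a 3-variable strongly convex QP.
    if __name__ == "__main__":
        Q = np.array([[4.0, 1.0, 0.0],
                      [1.0, 3.0, 1.0],
                      [0.0, 1.0, 2.0]])
        c = np.array([-1.0, 0.5, -2.0])
        print(regularized_jacobi(Q, c, lo=np.zeros(3), hi=np.ones(3)))

The proximal term rho * (x_i - x_i^k)^2 is what distinguishes this from a plain Jacobi sweep: it damps the parallel updates so that, for a sufficiently large rho, the sufficient decrease condition mentioned in the abstract can hold even when the unregularized parallel iteration would fail to converge.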

Keywords: Decentralized optimization, Jacobi algorithm, block coordinate descent methods, linear convergence

Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization)

Citation:

Download: [PDF]

Entry Submitted: 09/16/2016
Entry Accepted: 09/16/2016
Entry Last Modified: 08/04/2017
