Optimization Online


Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization

Ion Necoara(ion.necoara***at***acse.pub.ro)
Andrei Patrascu(andrei.patrascu***at***acse.pub.ro)

Abstract: In this paper we analyze several new methods for solving nonconvex optimization problems whose objective function is the sum of two terms: one nonconvex and smooth, the other convex, simple, and with known structure. We consider both the unconstrained and the linearly constrained case. For optimization problems with this structure, we propose random coordinate descent algorithms and analyze their convergence properties. In the general case, when the objective function is nonconvex and composite, we prove asymptotic convergence of the sequences generated by our algorithms to stationary points and a sublinear rate of convergence in expectation for some optimality measure. Additionally, if the objective function satisfies an error bound condition, we derive a local linear rate of convergence for the expected values of the objective function. We also present extensive numerical experiments evaluating the performance of our algorithms in comparison with state-of-the-art methods.
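The composite setting described in the abstract admits a simple coordinate-wise proximal gradient update: at each iteration, pick one coordinate uniformly at random and minimize a quadratic model of the smooth term plus the simple convex term along that coordinate. The sketch below illustrates this idea with an l1 term as the convex part; the function and parameter names are illustrative, and coordinate-wise Lipschitz constants `L[i]` are assumed known. It is a minimal instance of the random coordinate descent scheme, not the authors' exact algorithm.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t*|.|, applied to a scalar coordinate.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def random_coordinate_descent(grad_i, x0, L, lam, iters=1000, seed=0):
    """Random proximal coordinate descent for F(x) = f(x) + lam*||x||_1.

    grad_i(x, i) -- i-th partial derivative of the smooth term f
    L[i]         -- coordinate-wise Lipschitz constant of grad_i f
    lam          -- weight of the l1 (simple convex) term
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    n = x.size
    for _ in range(iters):
        i = rng.integers(n)          # uniform random coordinate choice
        g = grad_i(x, i)
        # Coordinate proximal step: minimize the quadratic upper model of f
        # plus lam*|x_i| over the i-th coordinate only.
        x[i] = soft_threshold(x[i] - g / L[i], lam / L[i])
    return x
```

For a quadratic smooth term f(x) = 0.5 x^T A x - b^T x, one can take `grad_i = lambda x, i: A[i] @ x - b[i]` and `L = np.diag(A)` (valid when the diagonal entries are positive); the same step applies when f is nonconvex, in which case the paper's analysis gives convergence to stationary points rather than global minima.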

Keywords: large scale nonconvex optimization; random coordinate descent algorithms; convergence analysis; rate of convergence

Category 1: Global Optimization

Category 2: Nonlinear Optimization

Category 3: Global Optimization (Theory)

Citation: A. Patrascu and I. Necoara, ``Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization'', Technical Report, UPB, submitted to a journal, 2013.

Download: [PDF]

Entry Submitted: 05/04/2013
Entry Accepted: 05/06/2013
Entry Last Modified: 05/04/2013
