Optimization Online

Efficiency of coordinate descent methods on huge-scale optimization problems

Yurii Nesterov (Yurii.Nesterov@uclouvain.be)

Abstract: In this paper we propose new methods for solving huge-scale optimization problems. For problems of this size, even the simplest full-dimensional vector operations are very expensive. Hence, we propose to apply an optimization technique based on random partial updates of the decision variables. For these methods, we prove global estimates for the rate of convergence. Surprisingly enough, for certain classes of objective functions, our results are better than the standard worst-case bounds for deterministic algorithms. We present constrained and unconstrained versions of the method, as well as its accelerated variant. Our numerical tests confirm the high efficiency of this technique on problems of very large size.
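The random partial-update idea mentioned in the abstract can be illustrated with a minimal sketch of randomized coordinate descent for a smooth convex function. The quadratic test problem, the coordinate-wise Lipschitz constants L[i], and all function names below are illustrative assumptions, not the specific methods analysed in the paper.

    import numpy as np

    def random_coordinate_descent(grad_i, L, x0, iters=10000, rng=None):
        """Sketch of randomized coordinate descent (illustrative, not the paper's exact scheme).

        grad_i(x, i) -- partial derivative of the objective along coordinate i
        L            -- coordinate-wise Lipschitz constants of the gradient (assumed known)
        """
        rng = np.random.default_rng() if rng is None else rng
        x = x0.copy()
        n = x.size
        for _ in range(iters):
            i = rng.integers(n)              # choose one coordinate uniformly at random
            x[i] -= grad_i(x, i) / L[i]      # update only that coordinate
        return x

    # Illustrative quadratic: f(x) = 0.5 * x^T A x - b^T x with A positive definite
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    grad_i = lambda x, i: A[i] @ x - b[i]    # partial gradient costs O(n), not O(n^2)
    L = np.diag(A)                           # coordinate-wise Lipschitz constants A_ii
    x_approx = random_coordinate_descent(grad_i, L, np.zeros(2))
    # The exact minimizer solves A x = b, so x_approx should be close to np.linalg.solve(A, b).

The point of the sketch is that each iteration touches only a single coordinate, so its cost is a small fraction of a full gradient step; this is what makes such schemes attractive when even full-dimensional vector operations are expensive.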

Keywords: Convex optimization, coordinate descent, worst-case efficiency estimates, fast gradient schemes, Google problem.

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Citation: CORE Discussion Paper #2010/2, (January 2010)

Download: [PDF]

Entry Submitted: 01/22/2010
Entry Accepted: 01/22/2010
Entry Last Modified: 02/01/2010
