Optimization Online

Gradient methods for minimizing composite objective function

Yurii Nesterov (nesterov***at***uclouvain.be)

Abstract: In this paper we analyze several new methods for solving optimization problems whose objective function is formed as a sum of two convex terms: one is smooth and given by a black-box oracle, while the other is general but simple, with known structure. Despite the bad properties of the sum, such problems can be solved, both in the convex and nonconvex cases, with the efficiency typical of the good part of the objective. For convex problems of the above structure, we consider primal and dual variants of the gradient method (which converge as $O\left({1 \over k}\right)$), and an accelerated multistep version with convergence rate $O\left({1 \over k^2}\right)$, where $k$ is the iteration counter. For all methods, we suggest efficient ``line search'' procedures and show that the additional computational work needed to estimate the unknown problem class parameters multiplies the complexity of each iteration by at most a small constant factor. We also present results of preliminary computational experiments, which confirm the superiority of the accelerated scheme.
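To make the setting concrete, below is a minimal Python sketch of the composite (proximal) gradient step for the model problem $\min_x {1 \over 2}\|Ax - b\|^2 + \lambda \|x\|_1$, the $l_1$-regularized case listed in the keywords. The smooth least-squares term plays the role of the black-box part, and the $l_1$ term is the "simple" part, handled in closed form via soft-thresholding. This is a sketch under stated assumptions, not the paper's exact method: it uses a fixed step $1/L$ in place of the adaptive line-search procedures the paper analyzes, and the accelerated variant uses a standard FISTA-style momentum sequence achieving the same $O(1/k^2)$ rate. The names `ista`, `fista`, and `soft_threshold` are illustrative, not from the paper.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1: the closed-form step for the "simple" term.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista(A, b, lam, n_iters=500):
    # Basic composite gradient method: O(1/k) rate on convex problems.
    # Fixed step 1/L (assumption; the paper instead estimates L by line search).
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)           # gradient of the smooth (black-box) part
        x = soft_threshold(x - grad / L, lam / L)
    return x

def fista(A, b, lam, n_iters=500):
    # Accelerated multistep variant: O(1/k^2) rate.
    # FISTA-style momentum shown here as a stand-in for the paper's scheme.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iters):
        grad = A.T @ (A @ y - b)
        x_next = soft_threshold(y - grad / L, lam / L)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # momentum extrapolation
        x, t = x_next, t_next
    return x

# Hypothetical usage on a random sparse-recovery instance:
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 200))
b = A @ (rng.standard_normal(200) * (rng.random(200) < 0.05))
x_hat = fista(A, b, lam=0.1)
```

Note that each iteration costs essentially one gradient of the smooth part plus one cheap proximal step, which is the sense in which the complexity is governed by the "good" part of the objective.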

Keywords: Local optimization, Convex optimization, Nonsmooth optimization, Complexity theory, Optimal methods, Structural optimization, $l_1$-regularization

Category 1: Convex and Nonsmooth Optimization

Citation: CORE Discussion Paper 2007/76, September 2007

Download: [PDF]

Entry Submitted: 09/20/2007
Entry Accepted: 09/20/2007
Entry Last Modified: 12/14/2007
