Optimization Online



Clóvis C. Gonzaga(clovis***at***mtm.ufsc.br)
Elizabeth W. Karas(ewkaras***at***gmail.com)
Diane R. Rossetto(dianerr***at***ime.usp.br)

Abstract: We describe three algorithms for solving differentiable convex optimization problems constrained to simple sets in $ \R^n $, i.e., sets onto which it is easy to project an arbitrary point. The first two algorithms are optimal in the sense that they achieve an absolute precision of $ \varepsilon $ with respect to the optimal value in $ O(1/\sqrt\varepsilon) $ iterations using only first-order information. This complexity depends on a Lipschitz constant $ L $ for the function derivatives and on a strong convexity constant $ \mu\geq 0 $. The first algorithm extends to the constrained case a well-known method devised by Nesterov \cite{nesterov1} for unconstrained problems, and includes a procedure for estimating the unknown value of $ L $. Its complexity analysis follows a simpler geometric approach. The other algorithms add several enhancements, including line searches that improve the practical performance: the second algorithm is enhanced and still optimal; the third relaxes optimality somewhat to obtain the best practical performance. Numerical tests on box-constrained quadratic problems are presented in the last section.
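The class of methods the abstract refers to can be illustrated by a minimal sketch of an accelerated projected-gradient iteration in the standard FISTA form, applied to a box-constrained quadratic as in the paper's numerical tests. This is an assumption-laden illustration of the general technique (projection onto a simple set plus Nesterov-style momentum), not the authors' exact algorithms; the step size uses a known Lipschitz constant `L`, whereas the paper's first algorithm estimates it.

```python
import numpy as np

def project_box(x, lo, hi):
    """Projection onto the box [lo, hi] -- an example of a 'simple set'."""
    return np.clip(x, lo, hi)

def accelerated_projected_gradient(grad, L, x0, lo, hi, iters=500):
    """Nesterov-style accelerated projected gradient (FISTA form).

    grad : callable returning the gradient of the smooth convex objective
    L    : Lipschitz constant of the gradient (assumed known here,
           unlike in the paper's first algorithm, which estimates it)
    """
    x = y = x0.copy()
    t = 1.0
    for _ in range(iters):
        # Gradient step from the extrapolated point, then project.
        x_new = project_box(y - grad(y) / L, lo, hi)
        # Momentum update giving the O(1/sqrt(eps)) iteration complexity.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Illustrative box-constrained quadratic: min 0.5 x'Ax - b'x  s.t. 0 <= x <= 1
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + np.eye(5)          # positive definite Hessian
b = rng.standard_normal(5)
L = np.linalg.eigvalsh(A).max()  # Lipschitz constant of the gradient

x_star = accelerated_projected_gradient(lambda x: A @ x - b, L,
                                        np.zeros(5), 0.0, 1.0)
```

At a solution, the fixed-point residual of the projected-gradient map vanishes, which gives a simple stopping test in practice.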

Keywords: optimal algorithms, convex problems

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Citation: Technical Report, Federal University of Paraná, Dep. of Mathematics, Brazil, June, 2011.

Download: [PDF]

Entry Submitted: 06/06/2011
Entry Accepted: 06/06/2011
Entry Last Modified: 06/06/2011
