Optimization Online


Universal gradient methods for convex optimization problems

Yurii Nesterov (Yurii.Nesterov***at***uclouvain.be)

Abstract: In this paper, we present new methods for black-box convex minimization. They do not need to know in advance the actual level of smoothness of the objective function. Their only essential input parameter is the required accuracy of the solution. At the same time, for each particular problem class they automatically ensure the best possible rate of convergence. We confirm our theoretical results by encouraging numerical experiments, which demonstrate that the fast rate of convergence typical for smooth optimization problems can sometimes be achieved even on nonsmooth problem instances.
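The adaptivity described in the abstract can be illustrated by a minimal sketch of a universal primal gradient scheme: the only essential input is the target accuracy eps, and a local constant L is found at each step by backtracking against an inexact descent condition. This is an illustrative simplification, not the paper's exact algorithm; the function names and the crude gradient-norm stopping test are assumptions for the demo.

```python
import numpy as np

def universal_gradient_method(f, grad, x0, eps, L0=1.0, max_iter=1000):
    """Sketch of a universal primal gradient step with backtracking.

    At each iteration, L is adjusted so that the trial point
    y = x - g / L satisfies the inexact upper bound
        f(y) <= f(x) + <g, y - x> + (L/2)||y - x||^2 + eps/2.
    No smoothness level of f is supplied in advance.
    """
    x, L = np.asarray(x0, dtype=float), L0
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:   # crude stopping test for this demo
            break
        L = max(L / 2.0, 1e-12)        # allow L to decrease between iterations
        while True:
            y = x - g / L
            d = y - x
            if f(y) <= f(x) + g @ d + 0.5 * L * (d @ d) + 0.5 * eps:
                break
            L *= 2.0                   # backtrack: double the trial constant
        x = y
    return x

# Usage on a simple smooth convex function f(x) = 0.5 ||x||^2:
sol = universal_gradient_method(lambda x: 0.5 * x @ x,
                                lambda x: x,
                                x0=[3.0, -4.0], eps=1e-6)
```

For this quadratic the backtracking quickly settles on L = 1 (the true Lipschitz constant of the gradient), so the method behaves like a plain gradient step; on a weakly smooth or nonsmooth objective the same loop produces larger L values, trading step size for the eps/2 slack in the descent condition.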

Keywords: convex optimization, black-box methods, complexity bounds, optimal methods, weakly smooth functions

Category 1: Convex and Nonsmooth Optimization

Citation: CORE DP 2013/26 (April 2013)

Download: [PDF]

Entry Submitted: 04/18/2013
Entry Accepted: 04/18/2013
Entry Last Modified: 06/12/2013

