Optimization Online


OSGA: A fast subgradient algorithm with optimal complexity

Arnold Neumaier (Arnold.Neumaier@univie.ac.at)

Abstract: This paper presents an algorithm for approximately minimizing a convex function over simple, not necessarily bounded, convex domains, assuming only that function values and subgradients are available. No global information about the objective function is needed apart from a strong convexity parameter (which can be set to zero if only convexity is known). The worst-case number of iterations needed to achieve a given accuracy is independent of the dimension (which may be infinite) and, apart from a constant factor, best possible under a variety of smoothness assumptions on the objective function.
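To illustrate the black-box oracle model the abstract assumes (only function values and subgradients are queried), here is a minimal sketch of a plain subgradient method with diminishing step sizes. This is not the OSGA scheme of the paper, and the test function, step-size rule, and parameter names are illustrative assumptions; it only shows the kind of first-order information such algorithms work with.

```python
import math

def subgradient_descent(f, subgrad, x0, iters=2000, c=1.0):
    """Plain subgradient method with diminishing steps c/sqrt(k).

    Only function values and subgradients are queried (black-box oracle).
    The best iterate is tracked, since f need not decrease monotonically
    for nonsmooth objectives.
    """
    x, best_x, best_f = x0, x0, f(x0)
    for k in range(1, iters + 1):
        g = subgrad(x)                      # oracle call: a subgradient at x
        x = x - (c / math.sqrt(k)) * g      # diminishing-step update
        fx = f(x)                           # oracle call: function value
        if fx < best_f:
            best_f, best_x = fx, x
    return best_x, best_f

# Example: f(x) = |x - 3|, a nonsmooth convex function with minimizer x* = 3.
f = lambda x: abs(x - 3.0)
subgrad = lambda x: 1.0 if x > 3.0 else (-1.0 if x < 3.0 else 0.0)
x_best, f_best = subgradient_descent(f, subgrad, x0=0.0)
```

Unlike this elementary scheme, whose complexity is dimension-independent but not optimal in the strongly convex case, the paper's method attains the optimal worst-case bounds under the stated smoothness assumptions.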

Keywords: complexity bound, convex optimization, optimal subgradient method, large-scale optimization, Nesterov's optimal method, nonsmooth optimization, optimal first-order method, smooth optimization, strongly convex

Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization )

Category 2: Convex and Nonsmooth Optimization (Convex Optimization )

Entry Submitted: 02/05/2014
Entry Accepted: 02/05/2014
Entry Last Modified: 02/05/2014
