Optimization Online


An optimal subgradient algorithm with subspace search for costly convex optimization problems

Masoud Ahookhosh(masoud.ahookhosh***at***univie.ac.at)
Arnold Neumaier(Arnold.Neumaier***at***univie.ac.at)

Abstract: This paper presents an acceleration of the optimal subgradient algorithm OSGA \cite{NeuO} for solving convex optimization problems whose objective function involves a costly affine term and a cheap nonlinear term. We combine OSGA with a multidimensional subspace search technique, which leads to a low-dimensional subproblem that can be solved efficiently. Numerical results for several applications are reported. A software package implementing the new method is available.
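The key saving behind a subspace search with a costly affine term can be sketched as follows. This is a minimal illustration, not the paper's OSGA variant: it assumes a least-squares data term $\tfrac12\|Ax-b\|^2$ plus a cheap $\ell_1$ term, with hypothetical names (`D` for the search directions, `f_sub` for the subspace objective). Applying the costly operator `A` once per direction lets the low-dimensional subproblem be minimized without any further applications of `A`.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
m, n, k = 200, 100, 3                 # k = subspace dimension (small)
A = rng.standard_normal((m, n))       # stands in for the costly linear operator
b = rng.standard_normal(m)

def f(x):
    # costly affine term + cheap nonlinear (here: l1) term
    return 0.5 * np.linalg.norm(A @ x - b) ** 2 + 0.1 * np.linalg.norm(x, 1)

x0 = np.zeros(n)                      # current iterate
D = rng.standard_normal((n, k))       # search directions (e.g. gradients, past steps)

# Pay for A only here: once per direction, plus once for the residual at x0.
AD = A @ D
r0 = A @ x0 - b

def f_sub(c):
    # objective restricted to x0 + D c; no further applications of A needed
    return (0.5 * np.linalg.norm(AD @ c + r0) ** 2
            + 0.1 * np.linalg.norm(x0 + D @ c, 1))

# The k-dimensional subproblem is cheap; a derivative-free solver suffices here.
res = minimize(f_sub, np.zeros(k), method="Nelder-Mead")
x_new = x0 + D @ res.x
```

Since `f_sub(0)` equals `f(x0)`, the subspace step can only improve (or match) the objective; in OSGA-style methods the directions in `D` would be chosen from quantities already computed by the algorithm, so `AD` comes almost for free.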

Keywords: Convex optimization, Nonsmooth optimization, Subgradient methods, Multidimensional subspace search, Optimal complexity, First-order black-box information, Costly linear operator

Category 1: Convex and Nonsmooth Optimization

Category 2: Convex and Nonsmooth Optimization (Nonsmooth Optimization )

Category 3: Convex and Nonsmooth Optimization (Convex Optimization )

Citation: Faculty of Mathematics, University of Vienna

Download: [PDF]

Entry Submitted: 04/01/2015
Entry Accepted: 04/01/2015
Entry Last Modified: 04/01/2015

