Optimization Online


An incremental mirror descent subgradient algorithm with random sweeping and proximal step

Radu Ioan Boț (radu.bot***at***univie.ac.at)
Axel Böhm (axel.boehm***at***univie.ac.at)

Abstract: We investigate the convergence properties of incremental mirror descent type subgradient algorithms for minimizing a sum of convex functions. In each step we evaluate the subgradient of only a single component function and mirror it back to the feasible domain, which makes the iterations very cheap to compute. The analysis is carried out for a randomized selection of the component functions, which yields the deterministic algorithm as a special case. Under supplementary differentiability assumptions on the function inducing the mirror map, we are also able to handle an additional term in the objective function, which is evaluated via a proximal-type step. In both cases we derive convergence rates of $\mathcal{O} \left(\frac{1}{\sqrt{k}} \right)$ in expectation for the $k$th best objective function value, and we illustrate our theoretical findings with numerical experiments in positron emission tomography and machine learning.

Keywords: nonsmooth convex minimization; incremental mirror descent algorithm; global rate of convergence; random sweeping

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Category 2: Stochastic Programming


Download: [PDF]

Entry Submitted: 09/17/2017
Entry Accepted: 09/18/2017
Entry Last Modified: 09/17/2017

