Optimization Online


Stochastic first order methods in smooth convex optimization.

Olivier Devolder(Olivier.Devolder***at***uclouvain.be)

Abstract: In this paper, we are interested in the development of efficient first-order methods for convex optimization problems in the simultaneous presence of smoothness of the objective function and stochasticity in the first-order information. First, we consider the Stochastic Primal Gradient method, which is nothing but the Mirror Descent SA method applied to a smooth function, and we develop new practical and efficient stepsize policies. Based on the machinery of estimate sequences, we also develop two new methods: a Stochastic Dual Gradient method and an accelerated Stochastic Fast Gradient method. Convergence rates on average, probabilities of large deviations, and accuracy certificates are studied. All of these methods are designed to decrease the effect of the stochastic noise at an unimprovable rate and to be easily implementable in practice (the practical efficiency of our methods is confirmed by numerical experiments). Furthermore, the biased case, when the oracle is not only stochastic but also affected by a bias, is considered for the first time in the literature.
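To illustrate the setting the abstract describes, here is a minimal sketch of a stochastic first-order method on a smooth convex function: the oracle returns the true gradient corrupted by zero-mean noise, the stepsize decreases over time to damp the noise, and the averaged iterate is returned. The specific objective, noise model, and 1/(L + sqrt(k)) stepsize are illustrative assumptions, not the paper's exact policies.

```python
import random

def noisy_gradient(x, rng, sigma=0.1):
    # Stochastic first-order oracle for f(x) = 0.5 * x**2:
    # true gradient f'(x) = x, corrupted by zero-mean Gaussian noise.
    return x + rng.gauss(0.0, sigma)

def stochastic_gradient_method(x0, n_steps=2000, L=1.0, seed=0):
    rng = random.Random(seed)
    x = x0
    avg = 0.0
    for k in range(1, n_steps + 1):
        # Decreasing stepsize of order 1/(L + sqrt(k)): nearly constant while
        # the deterministic error dominates, then O(1/sqrt(k)) to damp noise.
        step = 1.0 / (L + k ** 0.5)
        x -= step * noisy_gradient(x, rng)
        # Running average of the iterates reduces the variance of the output.
        avg += (x - avg) / k
    return avg

print(abs(stochastic_gradient_method(5.0)))  # averaged iterate near x* = 0
```

The averaging step matters in the stochastic setting: the last iterate keeps fluctuating at the noise level, while the running average smooths those fluctuations out.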

Keywords: Stochastic Optimization, Stochastic Approximation Methods, Smooth Convex Optimization, First-order Methods, Fast Gradient Method, Complexity Bounds, Probability of Large Deviations.

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Category 2: Stochastic Programming

Citation: CORE Discussion Paper 2011/70, Université catholique de Louvain, Belgium

Download: [PDF]

Entry Submitted: 01/19/2012
Entry Accepted: 01/19/2012
Entry Last Modified: 01/19/2012
