Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization, II: Shrinking Procedures and Optimal Algorithms
Abstract: In this paper we study new stochastic approximation (SA) type algorithms, namely, the accelerated SA (AC-SA), for solving strongly convex stochastic composite optimization (SCO) problems. Specifically, by introducing a domain shrinking procedure, we significantly improve the large-deviation results associated with the convergence rate of a nearly optimal AC-SA algorithm presented by the authors. Moreover, we introduce a multi-stage AC-SA algorithm, which possesses an optimal rate of convergence for solving strongly convex SCO problems in terms of the dependence not only on the target accuracy, but also on a number of problem parameters and the selection of initial points. To the best of our knowledge, this is the first time that such an optimal method has been presented in the literature. Our computational results show that these AC-SA algorithms can substantially outperform the classic SA and some other SA type algorithms for solving certain classes of strongly convex SCO problems.
Keywords: stochastic approximation, convex optimization, strong convexity, complexity, optimal method, large deviation
Category 1: Convex and Nonsmooth Optimization (Convex Optimization )
Category 2: Stochastic Programming
Category 3: Nonlinear Optimization
Citation: Technical Report, Department of Industrial and Systems Engineering, University of Florida.
Entry Submitted: 06/18/2012