Optimization Online
A linearly convergent stochastic recursive gradient method for convex optimization

Yan Liu(liuyan23ucas***at***outlook.com)
Xiao Wang(wangxiao***at***ucas.ac.cn)
Tiande Guo(tdguo***at***ucas.ac.cn)

Abstract: The stochastic recursive gradient algorithm (SARAH) [8] has recently attracted much interest. It admits a simple recursive framework for updating stochastic gradient estimates. Motivated by this, in this paper we propose SARAH-I, a variant incorporating importance sampling, and prove linear convergence of the sequence of distances between the iterates and the optimal solution set under both strong convexity and non-strong convexity conditions. Further, we propose to use the Barzilai-Borwein (BB) method to automatically compute step sizes for SARAH-I, yielding SARAH-I-BB, and we establish its convergence and complexity properties in different cases. Finally, numerical tests are reported, indicating the promising performance of SARAH-I-BB.
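
For readers unfamiliar with SARAH, the following Python sketch illustrates the basic recursive gradient framework of [8] on a finite-sum objective F(w) = (1/n) * sum_i f_i(w). It is only a minimal illustration, assuming user-supplied gradient oracles grad_i and full_grad, uniform sampling, and a fixed step size eta; the SARAH-I and SARAH-I-BB variants proposed in the paper replace these with importance (non-uniform) sampling and BB-computed step sizes, whose precise forms are given in the paper.

    import numpy as np

    def sarah(grad_i, full_grad, w0, n, eta, outer_iters, inner_iters, rng=None):
        """Minimal sketch of the basic SARAH recursion from [8].

        grad_i(w, i) -- gradient of the i-th component function at w
        full_grad(w) -- full gradient (1/n) * sum_i grad_i(w, i)
        This uses uniform sampling and a fixed step size eta; SARAH-I /
        SARAH-I-BB (this paper) use importance sampling and BB step sizes.
        """
        rng = rng or np.random.default_rng()
        w_tilde = np.asarray(w0, dtype=float)
        for _ in range(outer_iters):
            w_prev = w_tilde
            v = full_grad(w_prev)          # full gradient at the outer point
            w = w_prev - eta * v
            for _ in range(inner_iters):
                i = rng.integers(n)        # uniform sampling of a component index
                # recursive stochastic gradient estimate
                v = grad_i(w, i) - grad_i(w_prev, i) + v
                w_prev, w = w, w - eta * v
            # practical implementations keep the last inner iterate;
            # the analysis in [8] picks a uniformly random inner iterate
            w_tilde = w
        return w_tilde

For reference, a classical BB step size built from successive outer iterates takes the form eta_k = <s_k, s_k> / <s_k, y_k> with s_k = w_k - w_{k-1} and y_k = grad F(w_k) - grad F(w_{k-1}); SARAH-I-BB adapts a step size of this type to the stochastic setting, as detailed in the paper.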

Keywords: Stochastic optimization, stochastic gradient, BB method, linear convergence rate, complexity

Category 1: Convex and Nonsmooth Optimization

Category 2: Convex and Nonsmooth Optimization (Convex Optimization)

Citation:

Download: [PDF]

Entry Submitted: 04/24/2019
Entry Accepted: 04/25/2019
Entry Last Modified: 04/24/2019

