Optimization Online
Acceleration of SVRG and Katyusha X by Inexact Preconditioning

Yanli Liu(yanli.young.liu***at***gmail.com)
Fei Feng(fei.feng***at***math.ucla.edu)
Wotao Yin(wotaoyin***at***math.ucla.edu)

Abstract: Empirical risk minimization is an important class of optimization problems with many popular machine learning applications, and stochastic variance reduction methods are popular choices for solving them. Among these methods, SVRG and Katyusha X (a Nesterov-accelerated variant of SVRG) achieve fast convergence without substantial memory requirements. In this paper, we propose to accelerate these two algorithms by \textit{inexact preconditioning}: the proposed methods employ \textit{fixed} preconditioners. Although the subproblem in each epoch becomes harder, it suffices to apply a \textit{fixed} number of simple subroutines to solve it inexactly, without losing the overall convergence. As a result, this inexact preconditioning strategy gives provably better iteration complexity and gradient complexity than SVRG and Katyusha X. We also allow each function in the finite sum to be nonconvex while the sum is strongly convex. In our numerical experiments, we observe on average an $8\times$ speedup in the number of iterations and a $7\times$ speedup in runtime.
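For intuition, the following is a minimal Python sketch (not the authors' implementation) of the idea described in the abstract: a standard SVRG loop in which the usual gradient step is replaced by a preconditioned step M d = -eta * g with a \textit{fixed} preconditioner M, solved inexactly by a \textit{fixed} number of conjugate-gradient iterations. The ridge-regression objective, the function name inexact_precond_svrg, and the parameters (eta, inner_cg_steps, etc.) are illustrative assumptions, not taken from the paper.

    import numpy as np

    def inexact_precond_svrg(A, b, lam=1e-2, eta=0.5, epochs=20,
                             inner_cg_steps=3, seed=0):
        # Illustrative problem (assumption): ridge regression
        #   min_w (1/(2n)) ||A w - b||^2 + (lam/2) ||w||^2
        rng = np.random.default_rng(seed)
        n, d = A.shape
        w = np.zeros(d)

        # Fixed preconditioner: here, the constant Hessian of the ridge objective.
        M = A.T @ A / n + lam * np.eye(d)

        def full_grad(w):
            return A.T @ (A @ w - b) / n + lam * w

        def stoch_grad(w, i):
            return A[i] * (A[i] @ w - b[i]) + lam * w

        def cg_solve(M, rhs, steps):
            # Solve M x = rhs inexactly with a fixed number of CG iterations.
            x = np.zeros_like(rhs)
            r = rhs.copy()
            p = r.copy()
            for _ in range(steps):
                rr = r @ r
                if rr < 1e-20:
                    break
                Mp = M @ p
                alpha = rr / (p @ Mp)
                x += alpha * p
                r = r - alpha * Mp
                p = r + ((r @ r) / rr) * p
            return x

        for _ in range(epochs):
            w_snap = w.copy()
            mu = full_grad(w_snap)  # full gradient at the snapshot point
            for _ in range(n):
                i = rng.integers(n)
                # SVRG variance-reduced gradient estimator.
                g = stoch_grad(w, i) - stoch_grad(w_snap, i) + mu
                # Inexact preconditioned step: a fixed number of CG iterations
                # on M d = -eta * g instead of the plain step w -= eta * g.
                w += cg_solve(M, -eta * g, inner_cg_steps)
        return w

    # Usage sketch on synthetic data:
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 10))
    b = A @ rng.standard_normal(10)
    w_hat = inexact_precond_svrg(A, b)

The same pattern (fixed preconditioner, fixed number of inner subroutine steps per update) would apply to the accelerated variant; only the outer loop changes.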

Keywords: SVRG, Katyusha X, inexact preconditioning

Category 1: Stochastic Programming

Category 2: Convex and Nonsmooth Optimization (Convex Optimization )


Download: [PDF]

Entry Submitted: 05/23/2019
Entry Accepted: 05/23/2019
Entry Last Modified: 05/23/2019
