Optimization Online

Convergence rates of accelerated proximal gradient algorithms under independent noise

Sun Tao (nudtsuntao***at***163.com)
Barrio Roberto (rbarrio***at***unizar.es)
Jiang Hao (haojiang***at***nudt.edu.cn)
Cheng Lizhi (clzcheng***at***nudt.edu.cn)

Abstract: We consider an accelerated proximal gradient algorithm for composite optimization with "independent errors" (errors that are essentially unrelated to the algorithm's history), applied to linear inverse problems. We present a new inexact version of the FISTA algorithm that accounts for both deterministic and stochastic noise. We prove convergence rates for the algorithm and connect it with the existing catalyst framework used to accelerate many machine learning algorithms. We show that the catalyst can be regarded as a special case of the FISTA algorithm in which the smooth part of the objective vanishes. Our framework gives a more general formulation that provides convergence results for the deterministic and stochastic noise cases as well as for the catalyst framework. Some of our results give simpler alternative analyses of existing results in the literature, while others extend them to more general settings.
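The following is a minimal sketch (not the authors' code) of an inexact FISTA iteration of the kind described in the abstract, applied to a LASSO-type linear inverse problem min_x 0.5*||Ax - b||^2 + lam*||x||_1, with an independent stochastic error injected into the proximal step. All names (soft_threshold, noise_scale, inexact_fista) are illustrative assumptions, not taken from the paper.

import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def inexact_fista(A, b, lam, n_iter=200, noise_scale=1e-4, rng=None):
    # FISTA for 0.5*||Ax - b||^2 + lam*||x||_1 with additive "independent"
    # noise in the proximal step (a sketch of the setting in the abstract).
    rng = np.random.default_rng() if rng is None else rng
    L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the smooth gradient
    x_prev = np.zeros(A.shape[1])
    y = x_prev.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)                   # gradient of the smooth part at the extrapolated point
        x = soft_threshold(y - grad / L, lam / L)  # exact proximal gradient step
        x += noise_scale * rng.standard_normal(x.shape)   # independent stochastic error
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x + ((t - 1.0) / t_next) * (x - x_prev)       # Nesterov extrapolation
        x_prev, t = x, t_next
    return x_prev

Setting noise_scale = 0 recovers the standard exact FISTA iteration.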

Keywords: Convergence rates, inexact, acceleration

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Category 2: Nonlinear Optimization

Citation:

Download: [PDF]

Entry Submitted: 10/24/2017
Entry Accepted: 10/24/2017
Entry Last Modified: 10/24/2017
