Optimization Online
Variable smoothing for convex optimization problems using stochastic gradients

Radu Ioan Bot (radu.bot***at***univie.ac.at)
Axel Böhm (axel.boehm***at***univie.ac.at)

Abstract: We aim to solve a structured convex optimization problem in which a nonsmooth function is composed with a linear operator. When opting for full splitting schemes, primal-dual methods are usually employed, as they are effective and well studied. However, under the additional assumption that the nonsmooth function composed with the linear operator is Lipschitz continuous, we can derive novel algorithms through regularization via the Moreau envelope. Furthermore, we tackle large-scale problems by means of stochastic oracle calls, very similar to stochastic gradient techniques. Applications to total variation denoising and deblurring are provided.
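To make the abstract's idea concrete, the following is a minimal sketch (not the authors' actual algorithm) of variable smoothing with stochastic gradients for a toy 1-D total variation denoising problem. It minimizes f(x) + g(Kx) with f(x) = ½‖x − b‖², g = λ‖·‖₁, and K the forward-difference operator; the nonsmooth term g∘K is replaced by its Moreau envelope with a smoothing parameter μ_k that shrinks over iterations, and f is accessed through an unbiased single-coordinate stochastic gradient. All function names, step-size choices, and the μ_k schedule here are illustrative assumptions:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def variable_smoothing_sgd(b, lam=0.5, iters=2000, seed=0):
    # Illustrative sketch, not the paper's algorithm: minimize
    # 0.5*||x - b||^2 + lam*||K x||_1 via variable Moreau smoothing.
    rng = np.random.default_rng(seed)
    n = b.size
    # K: 1-D forward-difference operator (applied matrix-free), and its adjoint.
    K  = lambda x: np.diff(x)
    Kt = lambda y: np.concatenate(([-y[0]], -np.diff(y), [y[-1]]))
    x = np.zeros(n)
    for k in range(1, iters + 1):
        mu = 1.0 / k                         # shrinking smoothing parameter (assumed schedule)
        gamma = 1.0 / (1.0 + 4.0 / mu)       # step <= 1/(L_f + ||K||^2/mu); ||K||^2 <= 4
        # Unbiased stochastic gradient of f(x) = 0.5*||x - b||^2:
        # pick one coordinate at random and rescale by n.
        i = rng.integers(n)
        g_f = np.zeros(n)
        g_f[i] = n * (x[i] - b[i])
        # Gradient of the Moreau envelope of g = lam*||.||_1 composed with K:
        # grad = K^T (K x - prox_{mu*g}(K x)) / mu.
        Kx = K(x)
        g_env = Kt((Kx - soft_threshold(Kx, lam * mu)) / mu)
        x -= gamma * (g_f + g_env)
    return x
```

The smooth surrogate f + (λ‖·‖₁)_μ ∘ K has Lipschitz gradient with constant L_f + ‖K‖²/μ, which motivates the step size above; as μ_k → 0 the envelope approaches the original nonsmooth term, mirroring the variable-smoothing idea described in the abstract.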

Keywords: structured convex optimization problem, variable smoothing algorithm, convergence rate, stochastic gradients

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Category 2: Stochastic Programming

Citation:

Download: [PDF]

Entry Submitted: 05/16/2019
Entry Accepted: 05/16/2019
Entry Last Modified: 05/16/2019

