Optimization Online


Stochastic model-based minimization of weakly convex functions

Damek Davis(dsd95***at***cornell.edu)
Dmitriy Drusvyatskiy(ddrusv***at***uw.edu)

Abstract: We consider an algorithm that successively samples and minimizes stochastic models of the objective function. We show that under weak-convexity and Lipschitz conditions, the algorithm drives the expected norm of the gradient of the Moreau envelope to zero at the rate $O(k^{-1/4})$. Our result yields the first complexity guarantees for the stochastic proximal point algorithm on weakly convex problems and for the stochastic prox-linear algorithm for minimizing compositions of convex functions with smooth maps. Our general framework also recovers the recently obtained complexity estimate for the stochastic proximal subgradient method on weakly convex problems.
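The abstract's algorithm template successively samples a model of the objective and minimizes it with a quadratic proximal penalty. A minimal hedged sketch (illustrative only, not the paper's code or experiments) is the stochastic proximal point special case, applied to an assumed toy problem f(x) = E|x - ξ| with ξ ~ N(μ, 1), whose minimizer is the median μ; here the proximal subproblem has a closed form, so each step moves the iterate toward the sample by at most the proximal parameter:

```python
import numpy as np

# Illustrative sketch of the stochastic proximal point algorithm:
#   x_{k+1} = argmin_y { f(y, xi_k) + ||y - x_k||^2 / (2 a_k) }.
# Toy problem (assumed, not from the paper): f(x) = E|x - xi|,
# xi ~ N(mu, 1), minimized at the median mu.
# For f(y, xi) = |y - xi| the subproblem solves in closed form:
# move x toward xi by at most a_k (a soft-threshold step).

rng = np.random.default_rng(0)
mu = 3.0          # assumed problem parameter
x = 0.0           # initial iterate
for k in range(50000):
    xi = mu + rng.standard_normal()        # sample the stochastic model
    a_k = 1.0 / np.sqrt(k + 1)             # O(1/sqrt(k)) proximal parameter
    x += np.clip(xi - x, -a_k, a_k)        # exact proximal step for |y - xi|

print(x)  # approaches the median mu = 3.0
```

The same template recovers the stochastic proximal subgradient method when the sampled model is a linearization, and the stochastic prox-linear method when the objective is a composition of a convex function with a smooth map.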

Keywords: stochastic, proximal point method, prox-linear method, subgradient method, Moreau envelope, weakly convex

Category 1: Convex and Nonsmooth Optimization

Citation: arXiv:1803.06523

Entry Submitted: 03/29/2018
Entry Accepted: 03/29/2018
Entry Last Modified: 03/29/2018
