Optimization Online
Convergence Rate Analysis of a Stochastic Trust Region Method via Supermartingales

Jose Blanchet(jose.blanchet***at***columbia.edu)
Coralia Cartis(cartis***at***maths.ox.ac.uk)
Matt Menickelly(mmenickelly***at***anl.gov)
Katya Scheinberg(katyas***at***lehigh.edu)

Abstract: We propose a novel framework for analyzing convergence rates of stochastic optimization algorithms with adaptive step sizes. The framework is based on analyzing properties of an underlying generic stochastic process, in particular by deriving a bound on the expected stopping time of this process. We use this framework to analyze bounds on the expected global convergence rate of a stochastic variant of a traditional trust region method, introduced in (Chen, Menickelly, Scheinberg, 2014). While traditional trust region methods rely on exact computations of the gradient, the Hessian, and the objective function values, this method assumes that these quantities are available only up to some dynamically adjusted accuracy. Moreover, this accuracy is assumed to hold only with some sufficiently large, but fixed, probability, and without any additional restrictions on the variance of the errors. This setting applies, for example, to standard stochastic optimization and machine learning formulations. Improving upon the analysis in (Chen et al., 2014), we show that the stochastic process defined by the algorithm satisfies the assumptions of our proposed general framework, with the stopping time defined as the first iteration at which the accuracy $\|\nabla f(x)\|\leq \epsilon$ is reached. The resulting bound on this stopping time is $O(\epsilon^{-2})$, under the assumption of sufficiently accurate stochastic gradient estimates, and is the first global complexity bound for a stochastic trust-region method. Finally, we apply the same analytical framework to derive second-order complexity bounds for a second-order algorithmic variant, under an additional assumption on the estimates.
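To make the adaptive-step-size mechanism described in the abstract concrete, the following is a minimal, hypothetical Python sketch of a first-order stochastic trust-region iteration in the spirit of the method analyzed here; it is not the authors' exact algorithm. Gradient and function-value estimates are modeled with simple additive Gaussian noise, the step is a Cauchy-point step, and the ratio test with radius updates follows the classical trust-region template. The names stochastic_trust_region, eta, gamma, and the noise model are illustrative assumptions, not taken from the paper.

import numpy as np

def stochastic_trust_region(f, grad, x0, eps=1e-3, delta0=1.0,
                            eta=0.1, gamma=2.0, noise=1e-2,
                            max_iter=10000, seed=0):
    # Minimal sketch of a noisy first-order trust-region loop.
    # f and grad give exact values; noisy estimates are built from them
    # to mimic estimates that are only accurate with a fixed probability.
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    delta = delta0
    for k in range(max_iter):
        # Stochastic gradient estimate (illustrative noise model).
        g = grad(x) + noise * rng.standard_normal(x.shape)
        if np.linalg.norm(g) <= eps:
            return x, k  # stop when the estimated gradient norm falls below eps
        # Cauchy-point step: minimizer of the linear model on the trust-region ball.
        s = -delta * g / np.linalg.norm(g)
        # Noisy function-value estimates at the current and trial points.
        f0 = f(x) + noise * rng.standard_normal()
        f1 = f(x + s) + noise * rng.standard_normal()
        predicted = delta * np.linalg.norm(g)  # decrease predicted by the linear model
        rho = (f0 - f1) / predicted            # estimated-to-predicted decrease ratio
        if rho >= eta:
            x, delta = x + s, gamma * delta    # successful step: accept, expand radius
        else:
            delta /= gamma                     # unsuccessful step: reject, shrink radius
    return x, max_iter

# Usage example: minimize the quadratic f(x) = ||x||^2 / 2.
x_star, iters = stochastic_trust_region(
    f=lambda x: 0.5 * float(x @ x),
    grad=lambda x: x,
    x0=np.ones(5),
    eps=1e-1)

Note that this sketch stops on the estimated gradient norm, whereas the paper's $O(\epsilon^{-2})$ bound concerns the true gradient norm $\|\nabla f(x)\|$; the sketch only illustrates the mechanics of the adaptive radius and the ratio test.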

Keywords: stochastic methods, trust region, global rates of convergence

Category 1: Nonlinear Optimization

Citation: Technical Report, Oxford University, 2018

Download: [PDF]

Entry Submitted: 10/19/2018
Entry Accepted: 10/19/2018
Entry Last Modified: 10/19/2018



  Visitors Authors More about us Links
  Subscribe, Unsubscribe
Digest Archive
Search, Browse the Repository

 

Submit
Update
Policies
Coordinator's Board
Classification Scheme
Credits
Give us feedback
Optimization Journals, Sites, Societies
Mathematical Optimization Society