Optimization Online

A stochastic Levenberg-Marquardt method using random models with complexity results.

E. Bergou (elhoucine.bergou***at***inra.fr)
Y. Diouane (youssef.diouane***at***isae.fr)
V. Kungurtsev (vyacheslav.kungurtsev***at***fel.cvut.cz)
C. Royer (croyer2***at***wisc.edu)

Abstract: Globally convergent variants of the Gauss-Newton algorithm are often the methods of choice to tackle nonlinear least-squares problems. Among such frameworks, Levenberg-Marquardt and trust-region methods are two well-established, similar paradigms. Both schemes have been studied when the Gauss-Newton model is replaced by a random model that is only accurate with a given probability. Trust-region schemes have also been applied to problems where the objective value is subject to noise: this setting is of particular interest in fields such as data assimilation, where efficient methods that can adapt to noise are needed to account for the intrinsic uncertainty in the input data. In this paper, we describe a stochastic Levenberg-Marquardt algorithm that handles noisy objective function values and random models, provided sufficient accuracy is achieved in probability. Our method relies on a specific scaling of the regularization parameter that allows us to leverage existing results for trust-region algorithms. Moreover, we exploit the structure of our objective through the use of a family of stationarity criteria tailored to least-squares problems. Provided the probability of accurate function estimates and models is sufficiently large, we bound the expected number of iterations needed to reach an approximate stationary point, which generalizes results based on deterministic models or noiseless function values. We illustrate the link between our approach and several applications related to inverse problems and machine learning.
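For illustration only, the sketch below shows a generic regularized Gauss-Newton (Levenberg-Marquardt) loop of the kind the abstract refers to. It is not the paper's stochastic algorithm: the residual map F and Jacobian J are called as exact oracles here, though they could equally return noisy or sampled estimates, and the regularization parameter is updated from the ratio of achieved to predicted decrease, in the spirit of trust-region schemes. All function names, constants, and the update rule for the regularization parameter are illustrative assumptions, not taken from the paper.

# Minimal Levenberg-Marquardt sketch for min_x (1/2)||F(x)||^2 (Python).
# NOT the paper's method; F and J could be replaced by noisy estimates.
import numpy as np

def lm_loop(F, J, x0, gamma=1.0, max_iter=200, tol=1e-8, eta=1e-4):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r, Jx = F(x), J(x)                        # residual and Jacobian (possibly estimates)
        g = Jx.T @ r                              # gradient of (1/2)||F(x)||^2
        if np.linalg.norm(g) < tol:
            break
        # Regularized Gauss-Newton step: (J^T J + gamma I) s = -g
        B = Jx.T @ Jx
        s = np.linalg.solve(B + gamma * np.eye(x.size), -g)
        pred = -(g @ s + 0.5 * s @ B @ s)         # decrease predicted by the Gauss-Newton model
        r_trial = F(x + s)
        ared = 0.5 * (r @ r - r_trial @ r_trial)  # decrease actually achieved
        if ared / max(pred, 1e-16) >= eta:        # successful step: accept, relax regularization
            x, gamma = x + s, max(gamma / 2.0, 1e-8)
        else:                                     # unsuccessful step: reject, increase regularization
            gamma *= 4.0
    return x

# Toy usage: Rosenbrock-type residuals with solution at (1, 1).
F = lambda x: np.array([10.0 * (x[1] - x[0] ** 2), 1.0 - x[0]])
J = lambda x: np.array([[-20.0 * x[0], 10.0], [-1.0, 0.0]])
print(lm_loop(F, J, [-1.2, 1.0]))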

Keywords: Levenberg-Marquardt method, nonlinear least squares, regularization, random models, noisy functions, data assimilation.

Category 1: Nonlinear Optimization (Nonlinear Systems and Least-Squares)

Category 2: Nonlinear Optimization (Unconstrained Optimization)

Citation: To appear in SIAM/ASA J. Uncertain. Quantif., 2021

Entry Submitted: 07/05/2018
Entry Accepted: 07/05/2018
Entry Last Modified: 11/18/2021
