Strong local convergence properties of adaptive regularized methods for nonlinear least-squares
Stefania Bellavia (stefania.bellavia@unifi.it)
Abstract: This paper studies adaptive regularized methods for nonlinear least-squares problems in which the model of the objective function used at each iteration is either the Euclidean residual regularized by a quadratic term or the Gauss-Newton model regularized by a cubic term. For suitable choices of the regularization parameter, the regularization term ensures global convergence. In this paper we investigate the impact of the regularization term on the local convergence rate of the methods and establish that, under the well-known error bound condition, quadratic convergence to zero-residual solutions is enforced. This result extends the existing analysis of the local convergence properties of adaptive regularized methods. In fact, the known results were derived under the standard full-rank condition on the Jacobian at a zero-residual solution, whereas the error bound condition is weaker than the full-rank condition and allows the solution set to be locally nonunique.
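To make the quadratically regularized model concrete, the following is a minimal illustrative sketch (not the paper's exact algorithm) of a Levenberg-Marquardt-type variant: at each iteration the trial step minimizes the Gauss-Newton model plus a quadratic regularization term, and the regularization parameter is adapted from the ratio of actual to predicted reduction. All names and parameter values below are hypothetical choices for illustration.

```python
import numpy as np

def regularized_gauss_newton(F, J, x0, sigma=1.0, tol=1e-10, max_iter=50):
    """Sketch of an adaptive quadratically regularized Gauss-Newton iteration.

    The trial step minimizes the regularized model
        m(p) = 0.5*||F(x) + J(x) p||^2 + 0.5*sigma*||p||^2,
    whose minimizer solves the linear system
        (J^T J + sigma*I) p = -J^T F.
    The parameter sigma is increased on unsuccessful steps and decreased
    on successful ones, which is what drives global convergence in such
    schemes; the constants used here are arbitrary illustrative choices.
    """
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(max_iter):
        Fx, Jx = F(x), J(x)
        if np.linalg.norm(Jx.T @ Fx) < tol:  # first-order stationarity
            break
        # Regularized Gauss-Newton step.
        p = np.linalg.solve(Jx.T @ Jx + sigma * np.eye(n), -Jx.T @ Fx)
        Fxp = F(x + p)
        actual = 0.5 * (Fx @ Fx) - 0.5 * (Fxp @ Fxp)
        model = Fx + Jx @ p
        predicted = 0.5 * (Fx @ Fx) - 0.5 * (model @ model)
        rho = actual / max(predicted, 1e-16)
        if rho > 0.1:                  # successful step: accept and relax
            x = x + p
            sigma = max(sigma / 2.0, 1e-8)
        else:                          # unsuccessful step: strengthen
            sigma *= 4.0
    return x

# Zero-residual example: intersect the unit circle with the line x = y.
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
J = lambda x: np.array([[2.0 * x[0], 2.0 * x[1]], [1.0, -1.0]])
sol = regularized_gauss_newton(F, J, [2.0, 0.5])
```

Near the zero-residual solution the regularization parameter shrinks and the iteration reduces to a (fast) Gauss-Newton method, which is the regime the paper's local analysis addresses.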
Keywords: Nonlinear least-squares problems, regularized models, error-bound condition, quadratic convergence.
Category 1: Nonlinear Optimization (Nonlinear Systems and Least-Squares)
Citation: Technical Report n. 1/2013, Dipartimento di Ingegneria Industriale, Università di Firenze
Entry Submitted: 02/07/2013