Optimization Online
Convergence and evaluation-complexity analysis of a regularized tensor-Newton method for solving nonlinear least-squares problems subject to convex constraints

Nicholas Gould (nick.gould@stfc.ac.uk)
Tyrone Rees (tyrone.rees@stfc.ac.uk)
Jennifer Scott (jennifer.scott@stfc.ac.uk)

Abstract: Given a twice continuously differentiable vector-valued function $r(x)$, a local minimizer of $\|r(x)\|_2$ within a convex set is sought. We propose and analyse tensor-Newton methods, in which $r(x)$ is replaced locally by its second-order Taylor approximation, with convergence controlled by regularization of various orders. We establish global convergence to a constrained first-order critical point of $\|r(x)\|_2$, and provide function-evaluation bounds that agree with the best known for methods using second derivatives. Numerical experiments comparing tensor-Newton methods with regularized Gauss-Newton and Newton methods demonstrate the practical performance of the proposed method in the unconstrained case.
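In the notation of the abstract, each iteration builds the tensor-Newton model $t(x_k,s) = r(x_k) + J(x_k)s + \tfrac{1}{2}H(x_k)[s,s]$, where $J$ is the Jacobian of $r$ and $H(x)[s,s]$ denotes the vector with components $s^T \nabla^2 r_i(x)\, s$, and then computes a step by approximately minimizing the regularized model $\tfrac{1}{2}\|t(x_k,s)\|_2^2 + \tfrac{\sigma_k}{p}\|s\|_2^p$. The sketch below illustrates this construction in Python for the unconstrained case; the test residual, the choice $p = 3$, the gradient-descent inner solver, and all constants are assumptions made for this example only, not the algorithm analysed in the paper.

# Illustrative sketch of an unconstrained regularized tensor-Newton iteration
# for min 0.5*||r(x)||^2. The residual, the cubic regularization (p = 3),
# the crude gradient-descent inner solver, and all parameter values are
# assumptions made for this example, not the paper's algorithm.
import numpy as np

def r(x):     # example residual (Himmelblau system); a zero of r minimizes ||r||
    return np.array([x[0]**2 + x[1] - 11.0, x[0] + x[1]**2 - 7.0])

def Jac(x):   # Jacobian of r
    return np.array([[2.0 * x[0], 1.0], [1.0, 2.0 * x[1]]])

def Hess(x):  # Hessians of the residual components, shape (m, n, n)
    return np.array([[[2.0, 0.0], [0.0, 0.0]],
                     [[0.0, 0.0], [0.0, 2.0]]])

def model_grad(x, s, sigma):
    """Gradient in s of 0.5*||t(x,s)||^2 + (sigma/3)*||s||^3, where
    t(x,s) = r(x) + J(x)s + 0.5*H(x)[s,s] is the tensor-Newton model."""
    H = Hess(x)
    t = r(x) + Jac(x) @ s + 0.5 * np.einsum('ijk,j,k->i', H, s, s)
    Jt = Jac(x) + np.einsum('ijk,k->ij', H, s)   # Jacobian of t w.r.t. s
    return Jt.T @ t + sigma * np.linalg.norm(s) * s

def inner_solve(x, sigma, iters=2000, lr=5e-3):
    """Crude stand-in for the subproblem solver: gradient descent on the model."""
    s = np.zeros(2)
    for _ in range(iters):
        s -= lr * model_grad(x, s, sigma)
    return s

x, sigma = np.array([0.0, 0.0]), 1.0
for k in range(25):
    s = inner_solve(x, sigma)
    f_old = 0.5 * r(x) @ r(x)
    f_new = 0.5 * r(x + s) @ r(x + s)
    if f_new < f_old:                    # successful step: accept, relax sigma
        x, sigma = x + s, max(0.5 * sigma, 1e-3)
    else:                                # unsuccessful: reject, regularize more
        sigma *= 4.0
    print(f"iter {k:2d}  f = {f_old:.3e}")

The accept/reject test and the halve/quadruple updates of $\sigma_k$ mimic the usual successful/unsuccessful pattern of adaptive regularization methods; the paper's own acceptance and regularization rules, and its treatment of convex constraints, differ.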

Keywords: Nonlinear least squares, Tensor-Newton methods, Convergence analysis, Evaluation complexity analysis

Category 1: Nonlinear Optimization (Nonlinear Systems and Least-Squares)

Citation: Technical report RAL-TR-2019-001

Download: [PDF]

Entry Submitted: 01/17/2019
Entry Accepted: 01/17/2019
Entry Last Modified: 01/17/2019
