Optimization Online
Quadratic Regularization Methods with Finite-Difference Gradient Approximations

Geovani Grapiglia (geovani.grapiglia***at***uclouvain.be)

Abstract: This paper presents two quadratic regularization methods with finite-difference gradient approximations for smooth unconstrained optimization problems. One method is based on forward finite-difference gradients, while the other is based on central finite-difference gradients. In both methods, the accuracy of the gradient approximations and the regularization parameter in the quadratic models are jointly adjusted using a nonmonotone acceptance condition for the trial points. When the objective function is bounded from below and has a Lipschitz continuous gradient, it is shown that the method based on forward finite-difference gradients needs at most $\mathcal{O}\left(n\epsilon^{-2}\right)$ function evaluations to generate an $\epsilon$-approximate stationary point, where $n$ is the problem dimension. Under the additional assumption that the Hessian of the objective is Lipschitz continuous, an evaluation complexity bound of the same order is proved for the method based on central finite-difference gradients. Numerical results are also presented. They confirm the theoretical findings and illustrate the relative efficiency of the proposed methods.
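The sketch below illustrates the ingredients named in the abstract: forward and central finite-difference gradient approximations, and a quadratic regularization step obtained by minimizing the model $m_k(s) = f(x_k) + g_k^\top s + \tfrac{\sigma_k}{2}\|s\|^2$, whose minimizer is $s_k = -g_k/\sigma_k$. It is a minimal illustration only, not the paper's algorithm: the function names, the choice of difference step $h$, the simple (monotone) sufficient-decrease test, and the update rule for $\sigma_k$ are all assumptions made for the example; the paper couples the gradient accuracy and $\sigma_k$ through a nonmonotone acceptance condition.

```python
import numpy as np

def forward_fd_gradient(f, x, h):
    """Forward differences: g_i = (f(x + h*e_i) - f(x)) / h  (n extra evaluations)."""
    n = x.size
    g = np.empty(n)
    fx = f(x)
    for i in range(n):
        e = np.zeros(n); e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def central_fd_gradient(f, x, h):
    """Central differences: g_i = (f(x + h*e_i) - f(x - h*e_i)) / (2h)  (2n extra evaluations)."""
    n = x.size
    g = np.empty(n)
    for i in range(n):
        e = np.zeros(n); e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

def quad_reg_fd(f, x0, eps=1e-4, sigma0=1.0, max_iter=1000, central=False):
    """Illustrative quadratic regularization loop with finite-difference gradients.

    The regularized model m(s) = f(x) + g^T s + (sigma/2)||s||^2 is minimized by
    s = -g / sigma.  Tying the difference step h to sigma and using a simple
    monotone decrease test are assumptions of this sketch, not the paper's rule.
    """
    x, sigma = np.asarray(x0, dtype=float), float(sigma0)
    grad = central_fd_gradient if central else forward_fd_gradient
    for _ in range(max_iter):
        h = 1.0 / max(sigma, 1.0)                 # assumed link between FD accuracy and sigma
        g = grad(f, x, h)
        if np.linalg.norm(g) <= eps:              # approximate stationarity reached
            return x
        s = -g / sigma                            # minimizer of the regularized quadratic model
        if f(x + s) <= f(x) - 0.1 * np.linalg.norm(g) ** 2 / sigma:
            x, sigma = x + s, max(sigma / 2.0, 1e-8)   # accepted: take the step, relax sigma
        else:
            sigma *= 2.0                               # rejected: increase regularization
    return x

# Example usage on a smooth test function (assumed for illustration):
if __name__ == "__main__":
    rosenbrock = lambda z: (1 - z[0]) ** 2 + 100 * (z[1] - z[0] ** 2) ** 2
    print(quad_reg_fd(rosenbrock, np.array([-1.2, 1.0]), central=True))
```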

Keywords: Nonconvex Optimization; Derivative-Free Methods; Finite-Differences; Worst-Case Complexity

Category 1: Nonlinear Optimization

Citation:

Download: [PDF]

Entry Submitted: 11/01/2021
Entry Accepted: 11/01/2021
Entry Last Modified: 11/02/2021


