Optimization Online


Using Inexact Gradients in a Multilevel Optimization Algorithm

Michael Lewis (rmlewi***at***wm.edu)
Stephen Nash (snash***at***gmu.edu)

Abstract: Many optimization algorithms require gradients of the model functions, but computing accurate gradients can be computationally expensive. We study the implications of using inexact gradients in the context of the multilevel optimization algorithm MGOpt. MGOpt recursively uses (typically cheaper) coarse models to obtain search directions for finer-level models. However, MGOpt requires the gradient on the fine level to define the recursion. Our primary focus here is the impact of the gradient errors on the multilevel recursion. We analyze, partly through model problems, how MGOpt is affected under various assumptions about the source of the error in the gradients, and demonstrate that in many cases the effect of the errors is benign. Computational experiments are included.
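The multilevel recursion described above can be illustrated with a small, heavily simplified sketch. This is not the paper's MGOpt implementation: the 1-D quadratic model problem, the transfer operators (injection restriction, linear interpolation), and all step sizes are assumptions chosen for illustration. The key ingredient it does show is the first-order coherence correction: the coarse model's gradient is shifted by a linear term so that, at the restricted iterate, it matches the restricted fine-level gradient -- which is why MGOpt needs the fine-level gradient to define the recursion.

```python
# Illustrative two-level MGOpt-style V-cycle (a sketch, not the paper's code).
# Fine and coarse models are quadratics f(x) = 0.5 x'Ax - b'x built from a
# 1-D Laplacian; the coarse model gets a linear correction term v so that its
# gradient at the restricted point equals the restricted fine gradient.

def laplacian(n):
    """Dense 1-D Laplacian (2 on the diagonal, -1 off-diagonal)."""
    A = [[0.0] * n for _ in range(n)]
    for i in range(n):
        A[i][i] = 2.0
        if i > 0:
            A[i][i - 1] = -1.0
        if i < n - 1:
            A[i][i + 1] = -1.0
    return A

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def grad(A, b, x):
    """Gradient of f(x) = 0.5 x'Ax - b'x, i.e. Ax - b."""
    return [ax - bi for ax, bi in zip(matvec(A, x), b)]

def gd(A, b, x, steps, lr=0.25, shift=None):
    """Gradient descent; `shift` adds a linear correction term v'x."""
    for _ in range(steps):
        g = grad(A, b, x)
        if shift is not None:
            g = [gi + vi for gi, vi in zip(g, shift)]
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

def restrict(x):
    """Injection: keep every other fine-grid point."""
    return x[1::2]

def prolong(y, n):
    """Linear interpolation back to the fine grid (zero boundary values)."""
    x = [0.0] * n
    for j, yj in enumerate(y):
        x[2 * j + 1] = yj
    for i in range(0, n, 2):
        left = x[i - 1] if i > 0 else 0.0
        right = x[i + 1] if i < n - 1 else 0.0
        x[i] = 0.5 * (left + right)
    return x

def mgopt_vcycle(x):
    n, nc = len(x), len(x) // 2
    A, b = laplacian(n), [1.0] * n
    Ac, bc = laplacian(nc), [0.0] * nc       # coarse surrogate model
    x = gd(A, b, x, 3)                       # pre-smoothing on the fine level
    xc = restrict(x)
    # First-order coherence: v = R grad_f(x) - grad_fc(xc), so the corrected
    # coarse gradient at xc equals the restricted fine gradient.
    v = [gf - gc for gf, gc in zip(restrict(grad(A, b, x)),
                                   grad(Ac, bc, xc))]
    yc = gd(Ac, bc, xc, 20, shift=v)         # approx. corrected coarse solve
    d = prolong([yj - xj for yj, xj in zip(yc, xc)], n)
    x = [xi + di for xi, di in zip(x, d)]    # coarse-level search direction
    return gd(A, b, x, 3)                    # post-smoothing
```

Replacing `grad(A, b, x)` on the fine level with an inexact gradient is exactly the perturbation the paper studies: the error enters the correction term `v` and hence the coarse-level search direction.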

Keywords: multilevel optimization, inexact gradient evaluation, optimization-based multigrid, nonlinear optimization

Category 1: Nonlinear Optimization

Category 2: Nonlinear Optimization (Systems governed by Differential Equations Optimization)

Citation: Computational Optimization and Applications, volume 56 (2013), pp. 39-61.

Entry Submitted: 04/25/2012
Entry Accepted: 04/25/2012
Entry Last Modified: 07/22/2013
