Estimate sequence methods: extensions and approximations
Abstract: The estimate sequence approach offers an interesting rereading of a number of accelerated schemes proposed by Nesterov. In our view, this framework is the most appropriate one for analyzing the sensitivity of these schemes to approximations. In this work, we develop a simple, self-contained, and unified framework for the study of estimate sequences, with which we recover several accelerated schemes proposed by Nesterov, notably the acceleration procedure for constrained cubic regularization in convex optimization, and easily obtain generalizations to regularization schemes of any order. We carefully analyze the sensitivity of these algorithms to various types of approximation: partial resolution of subproblems, use of approximate subgradients, or both, and draw some guidelines for the design of further estimate sequence schemes.
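As a rough illustration of the kind of accelerated scheme that the estimate-sequence framework recovers, the sketch below implements Nesterov's classical accelerated gradient method for a smooth convex function (constant step size 1/L with momentum extrapolation). This is a generic textbook instance, not the paper's specific algorithm; the function names and the quadratic test problem are our own choices for illustration.

```python
import math

def nesterov_agd(grad, L, x0, iters=100):
    """Nesterov's accelerated gradient method with constant step 1/L.

    A standard scheme recoverable via estimate sequences; shown here as
    a generic illustration, not the paper's exact algorithm.
    """
    x, y, t = x0, x0, 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L                        # gradient step at the extrapolated point
        t_next = (1 + math.sqrt(1 + 4 * t * t)) / 2     # update of the momentum parameter
        y = x_next + ((t - 1) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x

# Toy example: minimize f(x) = (x - 3)^2, gradient 2(x - 3), Lipschitz constant L = 2.
x_star = nesterov_agd(lambda x: 2 * (x - 3), L=2.0, x0=0.0)
```

On this toy quadratic the iterates converge rapidly to the minimizer x = 3; the same template extends to the constrained and higher-order regularized settings discussed in the abstract.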
Keywords: Convex optimization, gradient schemes, approximations
Category 1: Convex and Nonsmooth Optimization (Convex Optimization)
Citation: IFOR Internal report, August 2009, ETH Zurich, Raemistrasse 101, CH-8092 Zurich, Switzerland.
Entry Submitted: 08/11/2009