Optimization Online


Efficiency of minimizing compositions of convex functions and smooth maps

Dmitriy Drusvyatskiy(ddrusv***at***uw.edu)
Courtney Paquette(yumiko88***at***u.washington.edu)

Abstract: We consider the problem of minimizing the sum of a convex function and a composition of a convex function with a smooth map. Important examples include exact penalty formulations of nonlinear programs and nonlinear least squares problems with side constraints. The basic algorithm we rely on is the well-known prox-linear method, which in each iteration solves a regularized subproblem formed by linearizing the smooth map. When the subproblems are solved exactly, the method has the efficiency guarantee $O(\epsilon^{-2})$, akin to gradient descent for smooth minimization. Our contributions are threefold: we (1) derive an inertial prox-linear method that accelerates in the presence of convexity, (2) quantify the impact of inexact subproblem solves, and (3) discuss the efficiency of smoothing techniques. When the subproblems can only be solved by first-order methods, we show that a simple combination of the prox-linear method, smoothing, and fast-gradient subproblem solves yields a scheme with overall efficiency $O(\epsilon^{-3})$, up to log factors. This appears to be the best known complexity bound for the problem class among first-order methods.
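To illustrate the basic iteration described above, the following is a minimal sketch of the prox-linear method for the nonlinear-least-squares instance mentioned in the abstract, i.e. with outer convex function $h = \tfrac{1}{2}\|\cdot\|^2$. Under this assumption the linearized, regularized subproblem has a closed-form solution via a damped normal-equations solve; the function names, the toy map `c`, and the fixed step parameter `t` are illustrative choices, not the authors' implementation.

```python
import numpy as np

def prox_linear(c, jac, x0, t=1.0, iters=100):
    """Prox-linear sketch for min_x h(c(x)) with h = 0.5*||.||^2.

    Each iteration linearizes the smooth map c at the current point x
    and solves the regularized subproblem
        min_y h(c(x) + J(x)(y - x)) + (t/2)*||y - x||^2,
    which for this choice of h reduces to the linear system
        (J^T J + t*I) d = -J^T c(x),   y = x + d.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r, J = c(x), jac(x)
        d = np.linalg.solve(J.T @ J + t * np.eye(x.size), -J.T @ r)
        x = x + d
    return x

# Toy smooth map: c(x) = (x0^2 + x1 - 1, x0 - x1), a zero-residual problem.
c = lambda x: np.array([x[0]**2 + x[1] - 1.0, x[0] - x[1]])
jac = lambda x: np.array([[2.0 * x[0], 1.0], [1.0, -1.0]])

x_star = prox_linear(c, jac, np.array([2.0, 2.0]))
```

With a general convex $h$ (e.g. a norm for exact penalties), each subproblem is itself a structured convex program, which is where the inexact solves and fast-gradient inner loops analyzed in the paper enter.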

Keywords: Composite minimization, fast gradient methods, Gauss-Newton, prox-gradient, inexactness, complexity, smoothing

Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization )

Citation: 12/2016

Download: [PDF]

Entry Submitted: 12/21/2016
Entry Accepted: 12/22/2016
Entry Last Modified: 12/21/2016
