Optimization Online


A Coordinate Gradient Descent Method for Nonsmooth Separable Minimization

Paul Tseng(tseng***at***math.washington.edu)
Sangwoon Yun(sangwoon***at***math.washington.edu)

Abstract: We consider the problem of minimizing the sum of a smooth function and a separable convex function. This problem includes as special cases bound-constrained optimization and smooth optimization with l_1-regularization. We propose a (block) coordinate gradient descent method for solving this class of nonsmooth separable problems. We establish global convergence and, under a local Lipschitzian error bound assumption, linear convergence for this method. The local Lipschitzian error bound holds under assumptions analogous to those for constrained smooth optimization, e.g., the convex function is polyhedral and the smooth function is (nonconvex) quadratic or is the composition of a strongly convex function with a linear mapping. We report numerical experience with solving l_1-regularized versions of unconstrained optimization problems from Moré et al. and from the CUTEr set. Comparison with L-BFGS-B and MINOS, applied to a reformulation of the l_1-regularized problem as a bound-constrained optimization problem, is also reported.
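To illustrate the flavor of the method described in the abstract (this is a minimal sketch, not the authors' algorithm or code): for min_x f(x) + c*||x||_1 with f smooth, each coordinate step minimizes a quadratic model of f plus the l_1 term, which has a closed-form soft-thresholding solution. The fixed diagonal Hessian approximation `h` and the cyclic coordinate order below are illustrative assumptions; the paper's method uses block coordinates and its own stepsize rules.

```python
import numpy as np

def soft_threshold(z, t):
    """Closed-form solution of min_d 0.5*(d - z)**2 + t*|d|."""
    return np.sign(z) * max(abs(z) - t, 0.0)

def cgd_l1(grad_f, x0, c, h=1.0, n_iters=100):
    """Cyclic coordinate gradient descent sketch for f(x) + c*||x||_1.

    h > 0 is a fixed scalar approximation to the diagonal of the Hessian
    (an illustrative choice, not the paper's rule).
    """
    x = x0.astype(float).copy()
    for _ in range(n_iters):
        for j in range(x.size):
            g = grad_f(x)[j]
            # Coordinate step: min_d g*d + (h/2)*d**2 + c*|x_j + d|
            x[j] = soft_threshold(x[j] - g / h, c / h)
    return x

# Hypothetical example: l_1-regularized least squares
# 0.5*||A x - b||^2 + c*||x||_1 with a diagonal A.
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([4.0, 0.5])
grad = lambda x: A.T @ (A @ x - b)
x_star = cgd_l1(grad, np.zeros(2), c=1.0, h=4.0)
```

The bound-constrained reformulation mentioned in the abstract (used for the L-BFGS-B and MINOS comparison) splits x = u - v with u, v >= 0, giving min_{u,v>=0} f(u - v) + c * sum(u + v), a smooth problem with simple bounds.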

Keywords: Error bound -- Global convergence -- Linear convergence rate -- Nonsmooth optimization -- Coordinate descent

Category 1: Convex and Nonsmooth Optimization

Category 2: Nonlinear Optimization (Bound-constrained Optimization)

Citation: Report, Mathematics Department, University of Washington, Seattle; June 2006; revised April 2007; to appear in Mathematical Programming, Series B.


Entry Submitted: 04/25/2007
Entry Accepted: 04/25/2007
Entry Last Modified: 04/25/2007
