Optimization Online

A Family of Second-Order Methods for Convex L1-Regularized Optimization

Richard H Byrd(richard.byrd***at***colorado.edu)
Gillian M Chin(gillian.chin***at***u.northwestern.edu)
Jorge Nocedal(nocedal***at***eecs.northwestern.edu)
Figen Oztoprak(figen***at***sabanciuniv.edu)

Abstract: This paper is concerned with the minimization of an objective that is the sum of a convex function $f$ and an $\ell_1$ regularization term. Our interest is in methods that incorporate second-order information about the function $f$ to accelerate convergence. We describe a semi-smooth Newton framework that can be used to generate a variety of second-order methods, including block active-set methods, orthant-based methods, and a second-order iterative soft-thresholding method. We also propose a new active-set method that performs multiple changes in the active manifold estimate and incorporates a novel mechanism for correcting these estimates when needed. This corrective mechanism is also evaluated in an orthant-based method. Numerical tests comparing the performance of several second-order methods are presented.
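To make the flavor of one of these methods concrete, below is a minimal sketch of a second-order iterative soft-thresholding step for minimizing $f(x) + \lambda \|x\|_1$. This is an illustrative assumption, not the authors' implementation: it scales each step by a diagonal Hessian approximation of $f$ (the function names `grad_f`, `hess_diag_f`, and the safeguard-free loop are hypothetical), whereas a practical method would add a line search or other globalization.

```python
import numpy as np

def soft_threshold(z, tau):
    """Componentwise soft-thresholding operator S_tau(z)."""
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def second_order_ista(x0, grad_f, hess_diag_f, lam, max_iter=100, tol=1e-8):
    """Sketch of a second-order ISTA iteration for f(x) + lam*||x||_1.

    Each step minimizes the separable quadratic model
        g^T (x - x_k) + 0.5 (x - x_k)^T D (x - x_k) + lam*||x||_1,
    where D = diag(hess_diag_f(x_k)), whose closed-form solution is a
    scaled soft-thresholding step.

    grad_f(x)      -> gradient of the smooth term f at x
    hess_diag_f(x) -> strictly positive diagonal Hessian approximation of f
    """
    x = x0.copy()
    for _ in range(max_iter):
        g = grad_f(x)
        d = hess_diag_f(x)                      # assumed > 0 componentwise
        x_new = soft_threshold(x - g / d, lam / d)
        if np.linalg.norm(x_new - x) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x = x_new
    return x

# Illustrative usage on a small least-squares problem f(x) = 0.5*||Ax - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10))
b = rng.standard_normal(40)
grad_f = lambda x: A.T @ (A @ x - b)
hess_diag_f = lambda x: np.sum(A * A, axis=0)   # diag(A^T A)
x_sparse = second_order_ista(np.zeros(10), grad_f, hess_diag_f, lam=0.1)
```

Note that using only the diagonal of the Hessian does not guarantee descent in general; this is what motivates the safeguards and corrective mechanisms discussed in the paper.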

Keywords: semi-smooth Newton, L1-regularization, quadratic programming

Category 1: Convex and Nonsmooth Optimization

Category 2: Nonlinear Optimization

Citation: Unpublished. Optimization Center, Northwestern University, Technical Report, June 2012

Download: [PDF]

Entry Submitted: 06/26/2012
Entry Accepted: 06/26/2012
Entry Last Modified: 06/26/2012
