Optimization Online


Newton-Like Methods for Sparse Inverse Covariance Estimation

Peder Olsen(pederao***at***us.ibm.com)
Figen Oztoprak(figen***at***su.sabanciuniv.edu)
Jorge Nocedal(nocedal***at***eecs.northwestern.edu)
Stephen Rennie(sjrennie***at***us.ibm.com)

Abstract: We propose two classes of second-order optimization methods for solving the sparse inverse covariance estimation problem. The first approach, which we call the Newton-LASSO method, minimizes a piecewise quadratic model of the objective function at every iteration to generate a step. We employ the fast iterative shrinkage-thresholding algorithm (FISTA) to solve this subproblem. The second approach, which we call the Orthant-Based Newton method, is a two-phase algorithm that first identifies an orthant face and then minimizes a smooth quadratic approximation of the objective function using the conjugate gradient method. Both approaches exploit the structure of the Hessian to compute the search direction efficiently and to avoid storing the Hessian explicitly. We show that quasi-Newton methods are also effective in this context, and describe a limited memory BFGS variant of the orthant-based Newton method. We present numerical results suggesting that all the techniques described in this paper have attractive properties and constitute useful tools for solving the sparse inverse covariance estimation problem. Comparisons with the method implemented in the QUIC software package [1] are presented.
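To make the setting concrete: the sparse inverse covariance estimation problem minimizes F(X) = -log det X + tr(SX) + λ‖X‖₁ over positive definite matrices X, where S is a sample covariance matrix. The sketch below is not the paper's Newton-LASSO or orthant-based method; it is a minimal first-order ISTA iteration (the non-accelerated cousin of FISTA) on this objective, included only to illustrate the shrinkage-thresholding step the abstract refers to. The function names, step size, and iteration count are illustrative choices, not from the paper.

```python
import numpy as np

def soft_threshold(A, tau):
    # Proximal operator of tau * ||.||_1: elementwise soft-thresholding.
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def ista_sparse_inv_cov(S, lam, step=0.1, iters=300):
    # Naive ISTA sketch for  min_X  -log det X + tr(S X) + lam * ||X||_1.
    # The gradient of the smooth part is S - X^{-1}.
    # Note: this toy loop does no line search and does not guard the
    # positive-definite cone, so it is only safe for small, well-scaled S.
    n = S.shape[0]
    X = np.eye(n)
    for _ in range(iters):
        G = S - np.linalg.inv(X)          # gradient of the smooth term
        X = soft_threshold(X - step * G, step * lam)
        X = 0.5 * (X + X.T)               # re-symmetrize against round-off
    return X
```

A second-order method such as Newton-LASSO replaces the plain gradient step with the minimization of a piecewise quadratic model built from the Hessian of the smooth term, which is what makes exploiting the Hessian structure (rather than storing it) important.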

Keywords: convex optimization, machine learning, statistics

Category 1: Convex and Nonsmooth Optimization (Convex Optimization )

Citation: Tech Report, Optimization Center, Northwestern University, June 2012

Download: [PDF]

Entry Submitted: 06/13/2012
Entry Accepted: 06/13/2012
Entry Last Modified: 06/13/2012
