Optimization Online


Alternating direction method of multipliers for sparse zero-variance discriminant analysis and principal component analysis

Brendan Ames(bpames***at***caltech.edu)
Mingyi Hong(mhong***at***umn.edu)

Abstract: We consider the task of classification in the high-dimensional setting, where the number of features of the given data is significantly greater than the number of observations. To accomplish this task, we propose sparse zero-variance discriminant analysis (SZVD) as a method for simultaneously performing linear discriminant analysis and feature selection on high-dimensional data. This method combines classical zero-variance discriminant analysis, in which discriminant vectors are identified in the null space of the sample within-class covariance matrix, with a penalty applied to the discriminant vectors to induce sparse solutions. We propose a simple algorithm based on the alternating direction method of multipliers (ADMM) for approximately solving the resulting nonconvex optimization problem. Further, we show that this algorithm applies to a larger class of penalized generalized eigenvalue problems, including a particular relaxation of the sparse principal component analysis problem. We also provide theoretical guarantees for convergence of our algorithm to stationary points of the original nonconvex problem, along with numerical experiments evaluating the performance of our classification heuristic on simulated data and on data drawn from applications in time-series classification.
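The abstract describes an ADMM scheme that splits a smooth objective from a sparsity-inducing penalty. As a hedged illustration of that generic splitting (not the paper's SZVD objective), the sketch below applies ADMM to an l1-penalized least-squares problem, where the penalty subproblem reduces to elementwise soft-thresholding; the function names and problem data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, k):
    # Elementwise soft-thresholding: the proximal operator of k * ||.||_1,
    # which solves the z-subproblem in closed form.
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def admm_l1(A, b, lam, rho=1.0, iters=200):
    """ADMM sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1,
    using the splitting x = z (illustrative, not the SZVD objective)."""
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)   # sparse copy of x
    u = np.zeros(n)   # scaled dual variable for the constraint x = z
    AtA, Atb = A.T @ A, A.T @ b
    M = np.linalg.inv(AtA + rho * np.eye(n))  # cached system solve
    for _ in range(iters):
        x = M @ (Atb + rho * (z - u))          # x-update: smooth subproblem
        z = soft_threshold(x + u, lam / rho)   # z-update: prox of the l1 penalty
        u = u + x - z                          # dual update on x = z
    return z

# Small synthetic example: recover a 3-sparse vector from noisy measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = admm_l1(A, b, lam=1.0)
```

Because the z-update thresholds exactly, the iterate `z` is genuinely sparse at every step; this is the same mechanism by which the penalized formulations in the paper induce sparse discriminant vectors, although SZVD's subproblems involve the within-class covariance structure rather than a least-squares fit.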


Category 1: Applications -- Science and Engineering (Statistics)

Category 2: Applications -- Science and Engineering (Data-Mining)



Entry Submitted: 01/22/2014
Entry Accepted: 01/22/2014
Entry Last Modified: 01/22/2014
