Convex Optimization Methods for Dimension Reduction and Coefficient Estimation in Multivariate Linear Regression

In this paper, we study convex optimization methods for computing the trace norm regularized least squares estimate in multivariate linear regression. The so-called factor estimation and selection (FES) method, recently proposed by Yuan et al. [17], conducts parameter estimation and factor selection simultaneously and has been shown to enjoy nice properties in both large and finite samples. Computing the estimates, however, can be very challenging in practice because of the high dimensionality and the trace norm constraint. We explore Nesterov's first-order methods [12, 13] and interior point methods for computing the penalized least squares estimate. The performance of these methods is then compared on a set of randomly generated instances. We show that the best of Nesterov's first-order methods substantially outperforms the interior point method implemented in SDPT3 version 4.0 (beta) [15], and is moreover much more memory efficient.
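For reference, the trace norm (nuclear norm) penalized least squares problem underlying the FES estimate can be written in the following standard form; this is a sketch of the generic formulation, and the exact constrained or penalized variant treated in the paper may differ:

\min_{B \in \mathbb{R}^{p \times q}} \; \tfrac{1}{2}\,\| Y - X B \|_F^2 + \lambda\, \| B \|_{*}, \qquad \| B \|_{*} = \sum_{i} \sigma_i(B),

where Y \in \mathbb{R}^{n \times q} collects the responses, X \in \mathbb{R}^{n \times p} the predictors, \lambda > 0 is the regularization parameter, and \sigma_i(B) are the singular values of the coefficient matrix B.

The sketch below is a minimal, generic accelerated (Nesterov-style) proximal gradient scheme for this penalized form, written in Python. It is intended only to illustrate the role of singular value soft thresholding in such methods; it is not the authors' algorithm, and all names (e.g. accelerated_proximal_gradient) and parameter choices are illustrative assumptions.

    import numpy as np

    def singular_value_soft_threshold(M, tau):
        """Proximal operator of tau * ||.||_* : soft-threshold the singular values of M by tau."""
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

    def accelerated_proximal_gradient(X, Y, lam, n_iter=500):
        """Generic Nesterov-accelerated proximal gradient sketch for
           min_B 0.5 * ||Y - X B||_F^2 + lam * ||B||_*  (not the authors' implementation)."""
        p, q = X.shape[1], Y.shape[1]
        B = np.zeros((p, q))
        Z = B.copy()
        t = 1.0
        # Lipschitz constant of the gradient of the smooth term: ||X||_2^2.
        L = np.linalg.norm(X, 2) ** 2
        for _ in range(n_iter):
            grad = X.T @ (X @ Z - Y)                                   # gradient of the least squares term at Z
            B_new = singular_value_soft_threshold(Z - grad / L, lam / L)  # proximal step
            t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t ** 2))             # Nesterov momentum update
            Z = B_new + ((t - 1.0) / t_new) * (B_new - B)
            B, t = B_new, t_new
        return B

For instance, B_hat = accelerated_proximal_gradient(X, Y, lam=1.0) returns an estimate of the coefficient matrix; the per-iteration cost is dominated by one singular value decomposition of a p-by-q matrix.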

Citation

Manuscript, Department of Mathematics, Simon Fraser University, 8888 University Drive, Burnaby, BC, V5A 1S6, Canada, January 2008
