Sufficient Conditions for Low-rank Matrix Recovery, Translated from Sparse Signal Recovery

Low-rank matrix recovery (LMR) is a rank minimization problem subject to linear equality constraints, and it arises in many fields such as signal and image processing, statistics, computer vision, and system identification and control. This class of optimization problems is NP-hard, and a popular approach replaces the rank function with the nuclear norm of the matrix variable. In this paper, we extend the concept of $s$-goodness for a sensing matrix in sparse signal recovery (proposed by Juditsky and Nemirovski [Math Program, 2011]) to linear transformations in LMR. We then give characterizations of $s$-goodness in the context of LMR. Using the two characteristic $s$-goodness constants, ${\gamma}_s$ and $\hat{\gamma}_s$, of a linear transformation, we not only derive necessary and sufficient conditions for a linear transformation to be $s$-good, but also provide sufficient conditions for exact and stable $s$-rank matrix recovery via nuclear norm minimization under mild assumptions. Moreover, we give computable upper bounds for one of the $s$-goodness characteristics, which lead to verifiable sufficient conditions for exact low-rank matrix recovery.
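For concreteness, the following display sketches the two problems the abstract contrasts: rank minimization under linear equality constraints and its nuclear norm relaxation. The notation is assumed rather than taken from this page: $\mathcal{A}$ denotes the linear transformation, $b$ the measurement vector, and $\|X\|_*$ the nuclear norm (sum of the singular values of $X$).

```latex
% Rank minimization with linear equality constraints (NP-hard in general):
\[
  \min_{X \in \mathbb{R}^{m \times n}} \operatorname{rank}(X)
  \quad \text{subject to} \quad \mathcal{A}(X) = b,
\]
% and its convex surrogate, nuclear norm minimization,
% where \|X\|_* is the sum of the singular values of X:
\[
  \min_{X \in \mathbb{R}^{m \times n}} \|X\|_{*}
  \quad \text{subject to} \quad \mathcal{A}(X) = b.
\]
```

The $s$-goodness conditions studied in the paper concern when the second (convex) problem recovers a matrix of rank at most $s$ satisfying the constraints of the first.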

Citation

Research report, June 2011
