Optimization Online


Sufficient Conditions for Low-rank Matrix Recovery, Translated from Sparse Signal Recovery

Lingchen Kong(konglchen***at***126.com)
Levent Tunçel (ltuncel***at***math.uwaterloo.ca)
Naihua Xiu(nhxiu***at***bjtu.edu.cn)

Abstract: Low-rank matrix recovery (LMR) is a rank minimization problem subject to linear equality constraints, and it arises in many fields such as signal and image processing, statistics, computer vision, system identification and control. This class of optimization problems is $NP$-hard, and a popular approach replaces the rank function with the nuclear norm of the matrix variable. In this paper, we extend the concept of $s$-goodness for a sensing matrix in sparse signal recovery (proposed by Juditsky and Nemirovski [Math Program, 2011]) to linear transformations in LMR. We then give characterizations of $s$-goodness in the context of LMR. Using the two characteristic $s$-goodness constants, ${\gamma}_s$ and $\hat{\gamma}_s$, of a linear transformation, we not only derive necessary and sufficient conditions for a linear transformation to be $s$-good, but also provide sufficient conditions for exact and stable $s$-rank matrix recovery via nuclear norm minimization under mild assumptions. Moreover, we give computable upper bounds for one of the $s$-goodness constants, which leads to verifiable sufficient conditions for exact low-rank matrix recovery.

Keywords: Low-rank matrix recovery, necessary and sufficient conditions, $s$-goodness, restricted isometry constant

Category 1: Convex and Nonsmooth Optimization

Category 2: Nonlinear Optimization

Citation: Research report, June 2011

Download: [PDF]

Entry Submitted: 06/15/2011
Entry Accepted: 06/17/2011
Entry Last Modified: 06/15/2011
