Optimization Online
An Efficient Gauss-Newton Algorithm for Symmetric Low-Rank Product Matrix Approximations

XIN LIU (liuxin***at***lsec.cc.ac.cn)
ZAIWEN WEN (wenzw***at***math.pku.edu.cn)
YIN ZHANG (yzhang***at***rice.edu)

Abstract: We derive and study a Gauss-Newton method for computing a symmetric low-rank product that is the closest to a given symmetric matrix in the Frobenius norm. Our Gauss-Newton method, which has a particularly simple form, shares the same order of iteration complexity as a gradient method when the size of the desired eigenspace is small, but can be significantly faster on a wide range of problems. In this paper, we prove global convergence and a Q-linear convergence rate for this algorithm, and perform numerical experiments on various test problems, including those from the recently active areas of matrix completion and robust principal component analysis. Numerical results show that the proposed algorithm is capable of providing considerable speed advantages over Krylov subspace methods on suitable application problems. Moreover, the algorithm possesses a higher degree of concurrency than Krylov subspace methods, thus offering better scalability on modern multi/many-core computers.
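
The core problem the abstract describes is minimizing ||UUᵀ − A||_F over n×k factors U for a given symmetric A. As a rough illustrative sketch (not the authors' full algorithm, which includes the safeguards and convergence machinery proven in the paper), one Gauss-Newton step can be derived by solving the linearized least-squares subproblem in closed form; the function name `gn_step` and the synthetic test data below are my own choices for illustration.

```python
import numpy as np

def gn_step(A, U):
    """One Gauss-Newton step for min_U ||U U^T - A||_F^2 (A symmetric).

    The step direction D solves the linearized least-squares problem
        min_D || U D^T + D U^T - (A - U U^T) ||_F,
    whose normal equations admit the closed-form particular solution
        D = (I - 0.5 U M^{-1} U^T)(A - U U^T) U M^{-1},   M = U^T U.
    Substituting D into U + D gives the simple update returned below.
    """
    M = U.T @ U                                 # k-by-k Gram matrix
    W = np.linalg.solve(M, (A @ U).T).T         # W = A U M^{-1}
    return 0.5 * U + W - 0.5 * U @ np.linalg.solve(M, U.T @ W)

# Demo on a synthetic exactly rank-k symmetric matrix, starting near a factor.
rng = np.random.default_rng(0)
n, k = 50, 3
B = rng.standard_normal((n, k))
A = B @ B.T                                     # symmetric, rank k
U = B + 0.01 * rng.standard_normal((n, k))      # perturbed starting point
for _ in range(20):
    U = gn_step(A, U)
residual = np.linalg.norm(U @ U.T - A)          # should be very small at convergence
print(residual)
```

Note the simplicity of the update: each step costs one product AU plus O(nk²) work with a k×k solve, which is what makes the per-iteration cost comparable to a gradient step when k is small.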

Keywords: Eigenvalue decomposition, singular value decomposition, low-rank product matrix approximation, Gauss-Newton methods

Category 1: Nonlinear Optimization (Nonlinear Systems and Least-Squares)

Category 2: Nonlinear Optimization (Constrained Nonlinear Optimization)

Citation: June 2014

Download: [PDF]

Entry Submitted: 06/02/2014
Entry Accepted: 06/02/2014
Entry Last Modified: 01/21/2015
