  


Tensor Principal Component Analysis via Convex Optimization
Bo Jiang (jiang373@umn.edu)

Abstract: This paper is concerned with the computation of the principal components of a general tensor, known as the tensor principal component analysis (PCA) problem. We show that the general tensor PCA problem is reducible to its special case, where the tensor in question is supersymmetric of even degree. In that case, the tensor can be embedded into a symmetric matrix. We prove that the tensor is rank-one if and only if the embedded matrix is rank-one. The tensor PCA problem can thus be solved by means of matrix optimization under a rank-one constraint, for which we propose two solution methods: (1) imposing a nuclear-norm penalty in the objective to enforce a low-rank solution; (2) relaxing the rank-one constraint by semidefinite programming. Interestingly, our experiments show that both methods yield a rank-one solution with high probability, thereby solving the original tensor PCA problem to optimality with high probability. To further cope with the size of the resulting convex optimization models, we propose to use the alternating direction method of multipliers (ADMM), which significantly reduces the computational effort. Various extensions of the model are considered as well.

Keywords: Tensor; Principal Component Analysis; Low Rank; Nuclear Norm; Semidefinite Programming Relaxation

Category 1: Convex and Nonsmooth Optimization (Convex Optimization)

Entry Submitted: 12/11/2012
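The abstract's central claim is that a supersymmetric even-degree tensor can be embedded into a symmetric matrix, with rank-one-ness preserved in both directions. The following is a minimal numerical sketch (not the authors' code) of that embedding for a 4th-order rank-one tensor: the square unfolding of x ⊗ x ⊗ x ⊗ x is the rank-one matrix vec(xxᵀ) vec(xxᵀ)ᵀ. All names here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: embed the supersymmetric 4th-order rank-one tensor
# T = x (outer) x (outer) x (outer) x into an n^2-by-n^2 symmetric matrix M
# by grouping the first two and last two indices, then verify M is rank-one.
n = 4
rng = np.random.default_rng(0)
x = rng.standard_normal(n)

# Rank-one 4th-order tensor via an iterated outer product.
T = np.einsum('i,j,k,l->ijkl', x, x, x, x)

# Square matrix embedding (the "matrix unfolding" of the tensor).
M = T.reshape(n * n, n * n)

# M coincides with vec(x x^T) vec(x x^T)^T: symmetric and rank-one.
v = np.outer(x, x).ravel()
assert np.allclose(M, np.outer(v, v))
assert np.allclose(M, M.T)
print("embedded matrix rank:", np.linalg.matrix_rank(M))
```

The converse direction (a rank-one embedded matrix forces a rank-one tensor) is the nontrivial part proved in the paper; the sketch only illustrates the easy direction.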
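Method (1) in the abstract penalizes the nuclear norm to promote a low-rank matrix solution. A hedged sketch of the key subproblem in ADMM-type schemes for such models is singular-value soft-thresholding, the proximal operator of the nuclear norm; the problem instance and parameter values below are assumptions for illustration, not the paper's experimental setup.

```python
import numpy as np

def svt(A, lam):
    """Singular-value thresholding: the minimizer of
    0.5 * ||M - A||_F^2 + lam * ||M||_* over matrices M."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    s = np.maximum(s - lam, 0.0)  # soft-threshold the singular values
    return U @ np.diag(s) @ Vt

rng = np.random.default_rng(1)
# Noisy observation of a rank-one symmetric matrix x x^T.
x = rng.standard_normal(6)
A = np.outer(x, x) + 0.01 * rng.standard_normal((6, 6))

# The nuclear-norm penalty zeroes out the small (noise) singular values,
# recovering a rank-one solution, as the abstract reports happens with
# high probability for the full model.
M = svt(A, lam=0.5)
print("rank after thresholding:", np.linalg.matrix_rank(M))
```

In the paper's actual model the penalized problem also carries the tensor-consistency constraints; the sketch isolates only the low-rank-inducing mechanism.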