  


Parallel Stochastic Gradient Algorithms for Large-Scale Matrix Completion
Benjamin Recht (brecht@cs.wisc.edu)

Abstract: This paper develops Jellyfish, an algorithm for solving data-processing problems with matrix-valued decision variables regularized to have low rank. Particular examples of problems solvable by Jellyfish include matrix completion problems and least-squares problems regularized by the nuclear norm or the max-norm. Jellyfish implements a projected incremental gradient method with a biased, random ordering of the increments. This biased ordering allows for a parallel implementation that admits a speedup nearly proportional to the number of processors. On large-scale matrix completion tasks, Jellyfish is orders of magnitude more efficient than existing codes. For example, on the Netflix Prize data set, prior art computes rating predictions in approximately 4 hours, while Jellyfish solves the same problem in under 3 minutes.

Keywords: Matrix completion, incremental gradient methods, parallel computing, multicore

Category 1: Nonlinear Optimization
Category 2: Applications - Science and Engineering (Data-Mining)
Category 3: Optimization Software and Modeling Systems

Download: [PDF]

Entry Submitted: 04/26/2011
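To illustrate the kind of incremental gradient update the abstract describes, the sketch below runs stochastic gradient descent on a factored low-rank model for matrix completion, visiting the observed entries in a random order each pass. This is only a minimal illustration under assumed choices (squared loss, Frobenius-norm regularization on the factors, fixed step size); it is not the authors' Jellyfish code, and in particular it omits the biased ordering and the parallel scheduling that are the paper's contribution. The function name and all parameters are invented for this sketch.

```python
import numpy as np

def sgd_matrix_completion(entries, shape, rank=2, step=0.05, mu=0.001,
                          epochs=300, seed=0):
    """Incremental-gradient sketch for low-rank matrix completion.

    `entries` is a list of (i, j, value) observations of an m x n matrix.
    The factors L (m x rank) and R (n x rank) parameterize the estimate
    X = L @ R.T. Each update touches only row i of L and row j of R, which
    is the structural fact that makes carefully ordered updates
    parallelizable; here the ordering is plain uniform shuffling.
    Hypothetical illustration, not the Jellyfish implementation.
    """
    rng = np.random.default_rng(seed)
    m, n = shape
    L = 0.1 * rng.standard_normal((m, rank))
    R = 0.1 * rng.standard_normal((n, rank))
    order = np.arange(len(entries))
    for _ in range(epochs):
        rng.shuffle(order)  # random ordering of the increments
        for k in order:
            i, j, v = entries[k]
            err = L[i] @ R[j] - v
            # Gradient of 0.5*err^2 + (mu/2)*(|L[i]|^2 + |R[j]|^2)
            gL = err * R[j] + mu * L[i]
            gR = err * L[i] + mu * R[j]
            L[i] -= step * gL
            R[j] -= step * gR
    return L, R
```

A typical use would fit the factors to a partially observed low-rank matrix and predict the missing entries from `L[i] @ R[j]`.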