Optimization Online
A First-Order Smoothing Technique for a Class of Large-Scale Linear Programs

Jieqiu Chen (jieqchen***at***mcs.anl.gov)
Sam Burer (samuel-burer***at***uiowa.edu)

Abstract: We study a class of linear programming (LP) problems motivated by large-scale machine learning applications. After reformulating the LP as a convex nonsmooth problem, we apply Nesterov's primal-dual smoothing technique. It turns out that the iteration complexity of the smoothing technique depends on a parameter $\theta$ that arises because we need to bound the originally unbounded primal feasible set. We design a strategy that dynamically updates $\theta$ to speed up the convergence. The application of our algorithm to two machine learning problems demonstrates several advantages of the smoothing technique over existing methods.
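
As a rough illustration of the general smoothing idea the abstract refers to (and not the authors' reformulation or their dynamic update of $\theta$), the sketch below smooths a max-type piecewise-linear objective with a log-sum-exp surrogate and minimizes it by accelerated projected gradient over a ball of radius $\theta$, mimicking how an unbounded feasible set is artificially bounded. The problem data, the quadratic ball constraint, and the fixed choice of smoothing level are all assumptions made for demonstration.

import numpy as np

def smoothed_minimax(A, b, theta, eps=1e-3, max_iter=5000):
    """Illustrative sketch: minimize f(x) = max_i (a_i^T x - b_i) over the ball
    ||x||_2 <= theta via Nesterov-style smoothing plus accelerated projected gradient.
    This is a generic demonstration of the smoothing technique, not the paper's algorithm."""
    m, n = A.shape
    mu = eps / (2.0 * np.log(m))            # smoothing level tied to the target accuracy
    L = np.linalg.norm(A, 2) ** 2 / mu      # Lipschitz constant of the smoothed gradient

    def grad_f_mu(x):
        # gradient of mu * log-sum-exp((Ax - b)/mu): softmax weights pulled back through A
        z = (A @ x - b) / mu
        z -= z.max()                         # stabilize the exponentials
        w = np.exp(z)
        w /= w.sum()
        return A.T @ w

    def project(x):
        # keep iterates inside the artificially bounded set {||x||_2 <= theta}
        nrm = np.linalg.norm(x)
        return x if nrm <= theta else x * (theta / nrm)

    x = y = np.zeros(n)
    t = 1.0
    for _ in range(max_iter):
        x_new = project(y - grad_f_mu(y) / L)                # gradient step on the smoothed objective
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)        # Nesterov momentum
        x, t = x_new, t_new
    return x

# Example usage with random data (purely illustrative):
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
b = rng.standard_normal(200)
x = smoothed_minimax(A, b, theta=10.0)
print("objective:", np.max(A @ x - b))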

Keywords: smoothing technique, large-scale linear programming, nonsmooth optimization, machine learning

Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization)

Category 2: Linear, Cone and Semidefinite Programming (Linear Programming)

Category 3: Applications -- Science and Engineering (Data-Mining)

Citation: Argonne Preprint ANL/MCS-P1971-1011, November 2011.

Download: [PDF]

Entry Submitted: 11/07/2011
Entry Accepted: 11/07/2011
Entry Last Modified: 11/07/2011
