Optimization Online

Practical Large-Scale Linear Programming using Primal-Dual Hybrid Gradient

David Applegate (dapplegate@google.com)
Mateo Díaz (md825@cornell.edu)
Oliver Hinder (ohinder@pitt.edu)
Haihao Lu (haihao.lu@chicagobooth.edu)
Miles Lubin (mlubin@google.com)
Brendan O'Donoghue (bodonoghue@deepmind.com)
Warren Schudy (wschudy@google.com)

Abstract: We present PDLP, a practical first-order method for linear programming (LP) that can solve to the high levels of accuracy expected in traditional LP applications. In addition, it can scale to very large problems because its core operations are matrix-vector multiplications. PDLP is derived by applying the primal-dual hybrid gradient (PDHG) method, popularized by Chambolle and Pock (2011), to a saddle-point formulation of LP. PDLP enhances PDHG for LP by combining several new techniques with older tricks from the literature; the enhancements include diagonal preconditioning, presolving, adaptive step sizes, and adaptive restarting. PDLP improves the state of the art for first-order methods applied to LP. We compare PDLP with SCS, an ADMM-based solver, on a set of 383 LP instances derived from MIPLIB 2017. With a target of $10^{-8}$ relative accuracy and a one-hour time limit, PDLP achieves a 6.3x reduction in the geometric mean of solve times and a 4.6x reduction in the number of unsolved instances (from 227 to 49). Furthermore, we highlight standard benchmark instances and a large-scale application (PageRank) where our open-source prototype of PDLP, written in Julia, outperforms a commercial LP solver.
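For intuition about the method the abstract describes, here is a minimal Julia sketch (Julia being the language of the prototype) of plain PDHG applied to the saddle-point formulation min_{x >= 0} max_y c'x - y'Ax + b'y of the LP min c'x subject to Ax = b, x >= 0. The function name pdhg_lp, the fixed step sizes, and the iteration count are illustrative assumptions; this is not PDLP itself, and none of the paper's enhancements (diagonal preconditioning, presolve, adaptive step sizes, adaptive restarts) are included.

    using LinearAlgebra

    # Plain PDHG for the LP saddle point  min_{x>=0} max_y  c'x - y'Ax + b'y
    # (i.e., min c'x s.t. Ax = b, x >= 0). Illustrative sketch only: PDLP
    # adds diagonal preconditioning, presolve, adaptive steps, and restarts.
    function pdhg_lp(c, A, b; iters = 10_000)
        x = zeros(length(c))
        y = zeros(length(b))
        # Fixed step sizes; the classical condition tau * sigma * opnorm(A)^2 <= 1
        # guarantees convergence of plain PDHG.
        tau = sigma = 0.9 / opnorm(Matrix(A))
        for _ in 1:iters
            xnew = max.(x .- tau .* (c .- A' * y), 0.0)   # projected primal step
            y = y .+ sigma .* (b .- A * (2 .* xnew .- x)) # dual step at extrapolated point
            x = xnew
        end
        return x, y
    end

On a toy instance such as pdhg_lp([1.0, 2.0], [1.0 1.0], [1.0]), the iterates approach the optimal vertex x = (1, 0). The enhancements listed in the abstract address the slow tail convergence that plain PDHG typically exhibits on LP.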

Keywords: linear programming, convex optimization, first-order methods, computational optimization

Category 1: Linear, Cone and Semidefinite Programming (Linear Programming)

Category 2: Convex and Nonsmooth Optimization (Convex Optimization)


Entry Submitted: 06/08/2021
Entry Accepted: 06/09/2021
Entry Last Modified: 06/08/2021
