Rescaling Algorithms for Linear Programming Part I: Conic feasibility

We propose simple polynomial-time algorithms for two linear conic feasibility problems. For a matrix $A\in \R^{m\times n}$, the {\em kernel problem} requires a positive vector in the kernel of $A$, and the {\em image problem} requires a positive vector in the image of $A^\top$. Both algorithms alternate between simple first-order steps and rescaling steps. The rescaling steps improve natural geometric potentials in the domain and image spaces, respectively. If Goffin's condition measure $\hat \rho_A$ is negative, then the kernel problem is feasible and the worst-case complexity of the kernel algorithm is $O\left((m^3n+mn^2)\log{|\hat \rho_A|^{-1}}\right)$; if $\hat\rho_A>0$, then the image problem is feasible and the image algorithm runs in time $O\left(m^2n^2\log{\hat \rho_A^{-1}}\right)$. We also address the degenerate case $\hat\rho_A=0$: we extend our algorithms to find maximum support nonnegative vectors in the kernel of $A$ and in the image of $A^\top$. We obtain the same running time bounds, with $\hat\rho_A$ replaced by appropriate condition numbers. If the input matrix $A$ has integer entries and total encoding length $L$, all algorithms run in polynomial time. Both the full support and the maximum support kernel algorithms run in time $O\left((m^3n+mn^2)L\right)$, whereas both image algorithms run in time $O\left(m^2n^2L\right)$. The standard linear programming feasibility problem can easily be reduced to either of the maximum support problems, yielding polynomial-time algorithms for linear programming.
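For concreteness, the two feasibility problems from the abstract can be stated as follows; the strict inequalities encode "positive vector", and the stated form of Goffin's condition measure is a standard column-normalized definition given here as an assumption, since the abstract does not define it explicitly:
\[
\text{\em Kernel problem:}\ \ \text{find } x\in\R^n \text{ with } Ax=0,\ x>0; \qquad
\text{\em Image problem:}\ \ \text{find } y\in\R^m \text{ with } A^\top y>0.
\]
With $\hat a_j = a_j/\|a_j\|$ denoting the normalized columns of $A$, one may take $\hat\rho_A=\max_{\|y\|=1}\min_{j\in[n]}\hat a_j^\top y$; under this reading, $\hat\rho_A>0$ corresponds to strict feasibility of the image problem and $\hat\rho_A<0$ to strict feasibility of the kernel problem, matching the case distinction in the abstract.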
