Algebraic Relaxations and Hardness Results in Polynomial Optimization and Lyapunov Analysis

The contributions of the first half of this thesis concern the computational and algebraic aspects of convexity in polynomial optimization. We show that unless P=NP, there exists no polynomial-time (or even pseudo-polynomial-time) algorithm that can decide whether a multivariate polynomial of degree four (or any higher even degree) is globally convex. This solves a problem that had been open since 1992, when N. Z. Shor asked for the complexity of deciding convexity for quartic polynomials. We also prove that deciding strict convexity, strong convexity, quasiconvexity, and pseudoconvexity of polynomials of even degree four or higher is strongly NP-hard. By contrast, we show that quasiconvexity and pseudoconvexity of odd-degree polynomials can be decided in polynomial time.

We then turn our attention to sos-convexity, an algebraic sufficient condition for polynomial convexity based on sums of squares (sos) that can be checked efficiently with semidefinite programming. We show that three natural formulations of sos-convexity, derived from sos relaxations of the definition of convexity, of its first-order characterization, and of its second-order characterization, are equivalent. We present the first example of a convex polynomial that is not sos-convex. Our main result is to prove that the cones of convex and sos-convex polynomials (resp. forms) in $n$ variables and of degree $d$ coincide if and only if $n=1$ or $d=2$ or $(n,d)=(2,4)$ (resp. $n=2$ or $d=2$ or $(n,d)=(3,4)$). Remarkably, although for seemingly unrelated reasons, convex polynomials (resp. forms) turn out to be sos-convex exactly in those cases where nonnegative polynomials (resp. forms) are sums of squares, as characterized by Hilbert in 1888.

The contributions of the second half of this thesis are on the development and analysis of computational techniques for certifying stability of uncertain and nonlinear dynamical systems. We show that deciding asymptotic stability of homogeneous cubic polynomial vector fields is strongly NP-hard. We settle some of the converse questions on the existence of polynomial and sum of squares Lyapunov functions. We present a globally asymptotically stable polynomial vector field with no polynomial Lyapunov function. We show via an explicit counterexample that if the degree of the polynomial Lyapunov function is fixed, then sos programming can fail to find a valid Lyapunov function even though one exists. By contrast, we show that if the degree is allowed to increase, then the existence of a polynomial Lyapunov function for a planar or a homogeneous polynomial vector field implies the existence of a polynomial Lyapunov function that can be found with sos programming. We extend this result to develop a converse sos Lyapunov theorem for robust stability of switched linear systems.

In our final chapter, we introduce the framework of path-complete graph Lyapunov functions for approximation of the joint spectral radius. The approach is based on the analysis of the underlying switched system via inequalities imposed among multiple Lyapunov functions associated with a labeled directed graph. Inspired by concepts in automata theory and symbolic dynamics, we define a class of graphs called path-complete graphs and show that any such graph gives rise to a method for proving stability of switched systems. The semidefinite programs arising from this technique include as special cases many of the existing methods, such as common quadratic, common sum of squares, and maximum/minimum-of-quadratics Lyapunov functions. We prove approximation guarantees for analysis via several families of path-complete graphs, as well as a constructive converse Lyapunov theorem for maximum/minimum-of-quadratics Lyapunov functions.
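To make the algebraic conditions above concrete, the following is a standard way of stating the second-order formulation of sos-convexity (an illustrative restatement rather than the exact wording of the thesis): a polynomial $p$ in $x \in \mathbb{R}^n$ is sos-convex if the Hessian form

\[
(x,y) \;\mapsto\; y^\top \nabla^2 p(x)\, y
\]

is a sum of squares of polynomials in the joint variables $(x,y)$, whereas convexity itself only requires this form to be nonnegative for all $x, y \in \mathbb{R}^n$. Because a polynomial $q$ is a sum of squares if and only if $q(z) = m(z)^\top Q\, m(z)$ for some positive semidefinite matrix $Q$ and some vector of monomials $m(z)$, sos-convexity can be verified by solving a semidefinite feasibility problem.

Similarly, for a discrete-time switched linear system $x_{k+1} = A_{\sigma(k)} x_k$ with $A_{\sigma(k)} \in \{A_1, \dots, A_m\}$, a minimal sketch of the semidefinite constraints attached to a path-complete graph (stated here with illustrative notation and under one common edge-orientation convention, which may differ from the one adopted in the thesis) assigns a quadratic function $V_i(x) = x^\top P_i x$ to each node $i$ and, for every edge from node $i$ to node $j$ labeled by a matrix $A_l$, imposes

\[
P_i \succ 0 \ \ \text{for every node } i, \qquad A_l^\top P_j\, A_l \preceq P_i \ \ \text{for every labeled edge } (i,j),
\]

which is equivalent to the Lyapunov inequality $V_j(A_l x) \le V_i(x)$ along the corresponding transition. A single node carrying one self-loop per matrix recovers the common quadratic Lyapunov function, and replacing each $A_l$ by $A_l/\gamma$ and bisecting over $\gamma > 0$ turns feasibility of these constraints into upper bounds on the joint spectral radius.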

Citation

PhD Thesis, MIT, September 2011.
