New Computational Guarantees for Solving Convex Optimization Problems with First Order Methods, via a Function Growth Condition Measure

Motivated by recent work of Renegar, we present new computational methods and associated computational guarantees for solving convex optimization problems using first-order methods. Our problem of interest is the general convex optimization problem f^* = \min_{x \in Q} f(x), where we presume knowledge of a strict lower bound f_slb < f^*. [Indeed, f_slb is naturally known when optimizing many loss functions in statistics and machine learning (least-squares, logistic loss, exponential loss, total variation loss, etc.), as well as in Renegar's transformed version of the standard conic optimization problem; in all these cases one has f_slb = 0 < f^*.] We introduce a new functional measure called the growth constant G for f(.), which measures how quickly the level sets of f(.) grow relative to the function value, and which plays a fundamental role in the complexity analysis. When f(.) is non-smooth, we present new computational guarantees for the Subgradient Descent Method and for smoothing methods that can improve existing computational guarantees in several ways, most notably when the initial iterate x^0 is far from the optimal solution set. When f(.) is smooth, we present a scheme for periodically restarting the Accelerated Gradient Method that can also improve existing computational guarantees when x^0 is far from the optimal solution set, and in the presence of added structure we present a scheme using parametrically increased smoothing that further improves the associated computational guarantees.
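To make the problem setup concrete, the following is a minimal generic sketch (not the paper's modified method or its guarantees) of projected subgradient descent for \min_{x \in Q} f(x), where a known strict lower bound f_slb is used in a Polyak-type step size; the function names, the example loss, and the feasible set Q below are illustrative assumptions only.

```python
import numpy as np

def projected_subgradient(f, subgrad, project, x0, f_slb, steps=1000):
    """Generic projected subgradient descent for min_{x in Q} f(x).

    This is a standard textbook sketch, not the paper's method: the step
    length uses the known strict lower bound f_slb < f^* as a surrogate
    optimal value (a Polyak-type step), illustrating one common way such
    a lower bound enters first-order methods.
    """
    x = x0.copy()
    best_x, best_val = x.copy(), f(x)
    for _ in range(steps):
        g = subgrad(x)
        gnorm2 = np.dot(g, g)
        if gnorm2 == 0.0:
            break  # zero subgradient: x is already optimal
        step = (f(x) - f_slb) / gnorm2  # Polyak-style step using f_slb
        x = project(x - step * g)
        val = f(x)
        if val < best_val:
            best_x, best_val = x.copy(), val
    return best_x, best_val

# Hypothetical example: f(x) = ||x - c||_1 over the unit ball Q,
# with f_slb = 0 (a nonnegative loss), so that f_slb = 0 < f^*.
c = np.array([2.0, -2.0])
f = lambda x: np.abs(x - c).sum()
subgrad = lambda x: np.sign(x - c)
project = lambda x: x / max(1.0, np.linalg.norm(x))
x_best, val_best = projected_subgradient(f, subgrad, project,
                                         np.zeros(2), f_slb=0.0)
```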

Citation

MIT Operations Research Center Working Paper, MIT, September 2015.
