Online First-Order Framework for Robust Convex Optimization

Robust optimization (RO) has emerged as one of the leading paradigms for efficiently modeling parameter uncertainty. Recent connections between RO and problems from the statistics and machine learning domains demand solving RO problems at ever larger scales. However, the traditional approaches for solving RO formulations, based on building and solving robust counterparts or on iterative approaches utilizing nominal feasibility oracles, can be prohibitively expensive and thus significantly hinder the scalability of the RO paradigm. In this paper, we present a general and flexible iterative framework for approximately solving robust convex optimization problems that is built on a fully online first-order paradigm. In comparison to the existing literature, a key distinguishing feature of our approach is that it requires access only to first-order oracles, which are remarkably cheaper than pessimization or nominal feasibility oracles, while maintaining the same convergence rates. This, in particular, makes our approach much more scalable and hence preferable in large-scale applications, especially those arising in machine learning and statistics. We also provide new interpretations of existing iterative approaches within our framework and illustrate the framework on robust quadratic programming.
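
To make the online first-order idea concrete, here is a minimal Python sketch, not taken from the paper: the toy robust quadratic objective, the uncertainty interval U = [-1, 1], the feasible ball for x, and the step sizes are all illustrative assumptions. Both the decision variable and the uncertain parameter take cheap projected (sub)gradient steps, so neither a pessimization oracle nor a nominal feasibility oracle is ever invoked.

    # Sketch of an online first-order scheme for min_{x in X} max_{u in U} f(x, u),
    # illustrated on a toy robust quadratic objective (all problem data assumed).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5                                   # dimension of the decision variable x
    A0 = np.eye(n)                          # nominal PSD matrix
    A1 = 0.1 * rng.standard_normal((n, n))
    A1 = A1 + A1.T                          # symmetric perturbation direction
    c = rng.standard_normal(n)

    def grad_x(x, u):
        # gradient in x of f(x, u) = 0.5 x'(A0 + u*A1)x + c'x
        return (A0 + u * A1) @ x + c

    def grad_u(x, u):
        # gradient in the scalar uncertain parameter u of the same objective
        return 0.5 * x @ (A1 @ x)

    def proj_ball(v, radius):
        nrm = np.linalg.norm(v)
        return v if nrm <= radius else v * (radius / nrm)

    x, u = np.zeros(n), 0.0
    x_avg = np.zeros(n)
    T = 2000
    for t in range(1, T + 1):
        eta = 1.0 / np.sqrt(t)              # diminishing step size (assumption)
        # decision player: projected subgradient descent step on x over X
        x = proj_ball(x - eta * grad_x(x, u), radius=10.0)
        # adversary: online gradient ascent step on u, kept in U = [-1, 1]
        u = float(np.clip(u + eta * grad_u(x, u), -1.0, 1.0))
        x_avg += (x - x_avg) / t            # running average of the x iterates

    print("averaged robust solution:", x_avg)

Averaging the x iterates, as done above, is a standard way such saddle-point style schemes convert no-regret first-order updates into an approximate robust solution.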

Citation

Technical report, Tepper School of Business, Carnegie Mellon University, July 2016
