From Data to Decisions: Distributionally Robust Optimization is Optimal
Bart P.G. Van Parys (vanparys@mit.edu)

Abstract: We study stochastic programs where the decision-maker cannot observe the distribution of the exogenous uncertainties but has access to a finite set of independent samples from this distribution. In this setting, the goal is to find a procedure that transforms the data into an estimate of the expected cost function under the unknown data-generating distribution, i.e., a predictor, and an optimizer of the estimated cost function that serves as a near-optimal candidate decision, i.e., a prescriptor. As functions of the data, predictors and prescriptors constitute statistical estimators. We propose a meta-optimization problem to find the least conservative predictors and prescriptors subject to constraints on their out-of-sample disappointment. The out-of-sample disappointment quantifies the probability that the actual expected cost of the candidate decision under the unknown true distribution exceeds its predicted cost. Leveraging tools from large deviations theory, we prove that this meta-optimization problem admits a unique solution: the best predictor-prescriptor pair is obtained by solving a distributionally robust optimization problem over all distributions within a given relative entropy distance from the empirical distribution of the data.

Keywords: Data-Driven Optimization, Distributionally Robust Optimization, Large Deviations Theory, Relative Entropy, Convex Optimization, Observed Fisher Information

Category 1: Stochastic Programming
Category 2: Robust Optimization
Category 3: Linear, Cone and Semidefinite Programming (Second-Order Cone Programming)

Entry Submitted: 04/12/2017
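As a rough numerical illustration of the central object (not the paper's own formulation), consider uncertainties with finite support. The worst-case expected cost over a relative-entropy (Kullback-Leibler) ball of radius r around the empirical distribution p̂ can be evaluated via its one-dimensional convex dual, min over λ > 0 of λr + λ log E_p̂[exp(c/λ)]; this is a standard result for KL-constrained DRO. The function name `kl_dro_cost`, the discretization, and the use of SciPy are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def kl_dro_cost(costs, p_hat, r):
    """Worst-case expected cost sup { E_p[c] : KL(p || p_hat) <= r } over a
    finite support, computed through the one-dimensional convex dual
        min_{lam > 0}  lam * r + lam * log E_{p_hat}[exp(c / lam)].
    costs: per-scenario costs; p_hat: empirical probabilities; r: ball radius.
    (Illustrative sketch; names and solver choice are not from the paper.)"""
    costs = np.asarray(costs, dtype=float)
    p_hat = np.asarray(p_hat, dtype=float)
    c_max = costs.max()

    def dual(lam):
        # Shift the exponent by c_max for numerical stability; add it back after.
        return lam * r + lam * np.log(p_hat @ np.exp((costs - c_max) / lam)) + c_max

    res = minimize_scalar(dual, bounds=(1e-9, 1e6), method="bounded")
    return res.fun
```

With radius r = 0 the ball collapses to the empirical distribution and the predictor reduces to the sample mean; as r grows, the prediction becomes more conservative and approaches the maximum scenario cost, illustrating the trade-off between conservatism and out-of-sample disappointment that the meta-optimization problem resolves.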