  


Kullback-Leibler Divergence Constrained Distributionally Robust Optimization
Zhaolin Hu (huzhaolin@gmail.com)

Abstract: In this paper we study distributionally robust optimization (DRO) problems where the ambiguity set of the probability distribution is defined by the Kullback-Leibler (KL) divergence. We first consider DRO problems where the ambiguity is in the objective function, which takes the form of an expectation, and show that the resulting minimax DRO problems can be formulated as one-layer convex minimization problems. We then consider DRO problems where the ambiguity is in the constraint, which may be either an expectation constraint or a chance constraint. We show that ambiguous expectation-constrained programs may be reformulated as one-layer convex optimization problems that take the form of the Bernstein approximation of Nemirovski and Shapiro (2006), and that ambiguous chance-constrained programs (CCPs) may be reformulated as the original CCP with an adjusted confidence level. A number of examples and special cases are also discussed to show that the reformulated problems may take simple forms that can be solved easily. The main contribution of the paper is to show that KL divergence constrained DRO problems are often of the same complexity as their original stochastic programming problems; thus, the KL divergence appears to be a good candidate for modeling distribution ambiguity in mathematical programming.

Keywords: Stochastic Programming; Nonlinear Programming; Ambiguity Modeling

Category 1: Stochastic Programming
Category 2: Robust Optimization
Category 3: Nonlinear Optimization

Citation:
Download: [PDF]
Entry Submitted: 11/11/2012
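For reference, the ambiguity set described in the abstract consists of all distributions within a fixed KL-divergence radius of a nominal distribution. In standard notation (the symbols below are assumed for illustration, not taken from the paper itself), with nominal distribution $P_0$, radius $\eta > 0$, and loss $h(x,\xi)$, the worst-case expectation over such a set admits the well-known one-layer convex dual form:

$$
\mathcal{D} = \bigl\{ P : D_{\mathrm{KL}}(P \,\|\, P_0) \le \eta \bigr\},
\qquad
D_{\mathrm{KL}}(P \,\|\, P_0) = \int \log\!\Bigl(\tfrac{dP}{dP_0}\Bigr)\, dP,
$$

$$
\sup_{P \in \mathcal{D}} \; \mathbb{E}_{P}\bigl[h(x,\xi)\bigr]
\;=\;
\min_{\alpha > 0} \;\Bigl\{ \alpha \eta + \alpha \log \mathbb{E}_{P_0}\bigl[ e^{h(x,\xi)/\alpha} \bigr] \Bigr\}.
$$

The right-hand side is a convex minimization over the single scalar $\alpha$, which is the sense in which the minimax DRO problem collapses to a "one-layer" convex problem.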