Phi-Divergence Constrained Ambiguous Stochastic Programs for Data-Driven Optimization

This paper investigates the use of phi-divergences in ambiguous (or distributionally robust) two-stage stochastic programs. Classical stochastic programming assumes that the distribution of the uncertain parameters is known. In many applications, however, the true distribution is unknown. Especially when data are scarce or of questionable reliability, an ambiguity set of distributions can be used to hedge against this distributional uncertainty. Phi-divergences (e.g., the Kullback-Leibler divergence and the chi-squared distance) provide a natural way to build an ambiguity set of distributions centered around a nominal distribution, which can be obtained from observed data, expert opinions, simulations, and so forth. In this paper, we present a classification of phi-divergences to elucidate their use for models with different properties and different sources of data, and we illustrate the classification on phi-divergences that yield common risk optimization models. We derive a condition for assessing the value of collecting additional data and show that, as more data are collected, the phi-divergence-based ambiguous program behaves essentially the same as the associated non-ambiguous stochastic program. We present a decomposition-based solution algorithm to solve the resulting model. Finally, we demonstrate the behavior of phi-divergences in an optimization setting on a numerical example.
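As a brief sketch of the construction (the notation below is illustrative and assumed rather than quoted from the paper): given a nominal distribution $q = (q_1, \ldots, q_n)$ over $n$ scenarios, a convex function $\phi$ with $\phi(1) = 0$, and a radius $\rho > 0$, a phi-divergence ambiguity set can be written as

\[
\mathcal{P} \;=\; \Big\{\, p \ge 0 \;:\; \sum_{\omega=1}^{n} p_\omega = 1,\;\; \sum_{\omega=1}^{n} q_\omega\, \phi\!\left(\frac{p_\omega}{q_\omega}\right) \le \rho \,\Big\},
\]

and the ambiguous two-stage program then minimizes the worst-case expected cost over this set, $\min_x \max_{p \in \mathcal{P}} \sum_{\omega} p_\omega\, h(x,\omega)$, where $h(x,\omega)$ denotes the (assumed) second-stage cost under decision $x$ and scenario $\omega$. For example, taking $\phi(t) = t \log t - t + 1$ recovers the Kullback-Leibler divergence, while $\phi(t) = (t-1)^2$ gives the chi-squared distance.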
