Optimization Online


Chebyshev Inequalities for Products of Random Variables

Napat Rujeerapaiboon (napat.rujeerapaiboon***at***epfl.ch)
Daniel Kuhn (daniel.kuhn***at***epfl.ch)
Wolfram Wiesemann (ww***at***imperial.ac.uk)

Abstract: We derive sharp probability bounds on the tails of a product of symmetric non-negative random variables using only information about their first two moments. If the covariance matrix of the random variables is known exactly, these bounds can be computed numerically using semidefinite programming. If only an upper bound on the covariance matrix is available, the probability bounds on the right tails can be evaluated analytically. The bounds under precise and imprecise covariance information coincide for all left tails as well as for all right tails corresponding to quantiles that are either sufficiently small or sufficiently large. We also prove that all probability bounds on the left tails reduce to the trivial bound 1 if the number of random variables in the product exceeds an explicit threshold. Thus, in the worst case, the weak-sense geometric random walk defined through the running product of the random variables is absorbed at 0 with certainty as soon as time exceeds the given threshold. The techniques devised for constructing Chebyshev bounds for products can also be used to derive Chebyshev bounds for sums, maxima and minima of non-negative random variables.
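
The abstract's claim that moment-based Chebyshev bounds can be computed by semidefinite programming can be illustrated on a much simpler instance than the products treated in the paper. The sketch below is not the authors' formulation; it is a minimal, hypothetical example (assuming Python with CVXPY, and illustrative parameters mu, sigma, a) of the standard moment-SDP idea for a single random variable with known mean and variance: the worst-case tail probability P(X >= a) is found by minimizing the expectation of a quadratic polynomial that dominates the indicator of the tail, and the result should match Cantelli's closed-form bound.

import cvxpy as cp

# hypothetical first- and second-moment information (names are illustrative)
mu, sigma, a = 1.0, 0.5, 2.0          # mean, standard deviation, tail threshold (a > mu)
m2 = sigma**2 + mu**2                 # second moment E[X^2]

# q(x) = [1 x] Q [1 x]^T must dominate the indicator of {x >= a}
Q = cp.Variable((2, 2), symmetric=True)   # Gram matrix of q
S = cp.Variable((2, 2), symmetric=True)   # Gram matrix of an SOS certificate
lam = cp.Variable(nonneg=True)            # multiplier for the (x - a) factor

constraints = [
    Q >> 0,   # q(x) >= 0 for all real x
    S >> 0,   # s(x) >= 0 for all real x
    # q(x) - 1 = s(x) + lam*(x - a) on the tail, matched coefficient by coefficient
    Q[0, 0] - 1 == S[0, 0] - lam * a,     # constant term
    2 * Q[0, 1] == 2 * S[0, 1] + lam,     # linear term
    Q[1, 1] == S[1, 1],                   # quadratic term
]

# E[q(X)] upper-bounds P(X >= a) for every feasible q; minimizing it over the
# given first two moments yields the sharp worst-case tail probability
prob = cp.Problem(cp.Minimize(Q[0, 0] + 2 * Q[0, 1] * mu + Q[1, 1] * m2), constraints)
prob.solve()

cantelli = sigma**2 / (sigma**2 + (a - mu)**2)   # closed-form Cantelli bound
print(round(prob.value, 4), round(cantelli, 4))  # both should be approximately 0.2

The paper's bounds for products of several random variables require a considerably more elaborate semidefinite formulation; this sketch only conveys how knowledge of two moments translates into a tractable conic program.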

Keywords: Chebyshev inequality, probability bounds, distributionally robust optimization, convex optimization

Category 1: Linear, Cone and Semidefinite Programming (Semi-definite Programming)

Category 2: Convex and Nonsmooth Optimization (Convex Optimization)


Download: [PDF]

Entry Submitted: 05/18/2016
Entry Accepted: 05/18/2016
Entry Last Modified: 05/18/2016
