Sample Complexity of Smooth Stochastic Optimization

Let $N(\epsilon,\delta)$ be the number of samples needed when solving a stochastic program so that the objective function evaluated at the sample optimizer is within $\epsilon$ of the true optimum with probability $1-\delta$. Previous results are of the form $N(\epsilon,\delta)=O(\epsilon^{-2}\log \delta^{-1})$. However, a smooth objective function is often locally quadratic at an interior optimum. For that case we use results on the convergence of the sample optimizers to show that $N(\epsilon,\delta)=O(\epsilon^{-1}\log\delta^{-1})$. These results hold both as bounds and asymptotically. Hence we show, for a common case (smooth objective functions with an interior optimum), that the number of samples needed is $O(\epsilon^{-1})$. This suggests that stochastic optimization is a practical approach for such problems.
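To make the locally-quadratic intuition concrete, here is a minimal numerical sketch (not taken from the paper): the stochastic program $\min_x E[(x-Z)^2]$ with $Z\sim N(0,1)$ is smooth with interior optimum $x^*=0$ and optimal value $1$. Its sample-average optimizer is the sample mean, whose objective gap is $\hat{x}^2 = O(1/n)$, consistent with a sample requirement of $O(\epsilon^{-1})$. The problem, sample sizes, and replication count below are illustrative choices, not values from the paper.

```python
# Illustrative sketch: objective gap of the sample-average optimizer for
# min_x E[(x - Z)^2], Z ~ N(0, 1).  The sample optimizer is the sample mean
# x_hat, and the true objective gap at x_hat is exactly x_hat^2, whose
# expectation is 1/n -- so halving the target gap roughly doubles the samples.
import numpy as np

rng = np.random.default_rng(0)
reps = 2000  # independent replications per sample size (arbitrary choice)

for n in (100, 400, 1600, 6400):
    samples = rng.standard_normal((reps, n))
    x_hat = samples.mean(axis=1)      # sample optimizer for each replication
    gap = x_hat ** 2                  # objective gap relative to the true optimum
    print(f"n = {n:5d}   mean gap = {gap.mean():.2e}   n * mean gap ≈ {n * gap.mean():.2f}")
```

In this toy case `n * mean gap` stays near 1 as `n` grows, i.e. the gap shrinks like $1/n$ rather than the $1/\sqrt{n}$ rate behind the older $O(\epsilon^{-2})$ bounds.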
