A Regularized Smoothing Method for Fully Parameterized Convex Problems with Applications to Convex and Nonconvex Two-Stage Stochastic Programming
Pedro Borges (pedro.borges.melo@gmail.com)
Abstract: We present an approach to regularize and approximate solution mappings of parametric convex optimization problems that combines interior penalty (log-barrier) solutions with Tikhonov regularization. Because the regularized mappings are single-valued and smooth under reasonable conditions, they can be used to build a computationally practical smoothing for the associated optimal value function. The value function in question, while resulting from parameterized convex problems, need not be convex. One motivating application of interest is two-stage (possibly nonconvex) stochastic programming. We show that our approach, which is computationally implementable, provides locally bounded upper bounds for the subdifferential of the value function of qualified convex problems. As a by-product of our development, we also recover that, in the given setting, the value function is locally Lipschitz continuous. Numerical experiments are presented for two-stage convex stochastic programming problems, comparing the approach with the bundle method for nonsmooth optimization.
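To illustrate the idea on a toy instance (not taken from the paper), consider the parametric value function v(x) = min{ x·y : −1 ≤ y ≤ 1 } = −|x|, which is nonsmooth at x = 0. The sketch below, under assumed notation, replaces the inner problem by its log-barrier plus Tikhonov regularization, min_y x·y − μ log(1−y) − μ log(1+y) + (μ/2)y², whose minimizer is unique and depends smoothly on x; plugging it back into the original objective gives a smooth approximation of v. The function names and the bisection solver are illustrative choices, not the authors' implementation.

```python
def smoothed_solution(x, mu, tol=1e-12):
    """Minimizer over (-1, 1) of the regularized barrier objective
        phi(y) = x*y - mu*log(1-y) - mu*log(1+y) + (mu/2)*y**2,
    found by bisection on phi', which is strictly increasing in y."""
    def dphi(y):
        # phi'(y): linear term + barrier gradients + Tikhonov term
        return x + mu / (1.0 - y) - mu / (1.0 + y) + mu * y

    lo, hi = -1.0 + 1e-15, 1.0 - 1e-15  # phi' -> -inf / +inf at the ends
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if dphi(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)


def smoothed_value(x, mu):
    """Smooth approximation of v(x) = -|x|: evaluate the original
    objective x*y at the regularized solution y_mu(x)."""
    return x * smoothed_solution(x, mu)
```

As μ decreases, smoothed_value(x, μ) approaches −|x| pointwise while remaining differentiable in x, which is the mechanism the paper exploits to smooth value functions of two-stage problems.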
Keywords: Smoothing Techniques, Interior Penalty Solutions, Tikhonov Regularization, Nonconvex Stochastic Programming, Two-Stage Stochastic Programming, Lipschitz Continuity of Value Functions.
Category 1: Convex and Nonsmooth Optimization (Nonsmooth Optimization)
Category 2: Stochastic Programming
Category 3: Nonlinear Optimization
Entry Submitted: 01/07/2020