Optimization Online


SABRINA: A Stochastic Subspace Majorization-Minimization Algorithm

Jean-Baptiste Fest (jean-baptiste.fest***at***inria.fr)
Emilie Chouzenoux (emilie.chouzenoux***at***centralesupelec.fr)

Abstract: A wide class of problems involves the minimization of a coercive and differentiable function $F$ on $\mathbb{R}^N$ whose gradient cannot be evaluated exactly. In such a context, many existing convergence results from the standard gradient-based optimization literature cannot be applied directly, and robustness to errors in the gradient is not necessarily guaranteed. This work investigates the convergence of Majorization-Minimization (MM) schemes when stochastic errors affect the gradient terms. We introduce a general stochastic optimization framework, called SABRINA (StochAstic suBspace majoRIzation-miNimization Algorithm), that encompasses quadratic MM schemes, possibly enhanced with a subspace acceleration strategy. New asymptotic results are established for the stochastic process generated by SABRINA. Two sets of numerical experiments, in machine learning and in image processing, are presented to support our theoretical results and illustrate the strong performance of SABRINA relative to state-of-the-art gradient-based stochastic optimization methods.
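To make the setting concrete, the sketch below illustrates a generic stochastic quadratic MM step on a binary logistic regression problem (one of the applications mentioned in the abstract). This is an illustrative toy example under our own assumptions, not the SABRINA algorithm itself: it uses the classical fixed majorant curvature $X^\top X/(4n)$ for the logistic loss, mini-batch gradients as the inexact gradient oracle, and a decaying step size to damp the stochastic noise; the subspace acceleration of SABRINA is omitted.

```python
import numpy as np

# Illustrative sketch: a stochastic quadratic Majorization-Minimization step
# for binary logistic regression. Toy data and hyperparameters are our own
# assumptions; this is NOT the SABRINA algorithm from the paper.

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = (X @ w_true > 0).astype(float)  # labels in {0, 1}

def full_loss(w):
    # Average logistic negative log-likelihood.
    z = X @ w
    return np.mean(np.logaddexp(0.0, z) - y * z)

# Classical bound: the logistic Hessian X^T diag(s(1-s)) X / n is dominated
# by A = X^T X / (4n), which yields a quadratic majorant of the loss.
A = X.T @ X / (4 * n) + 1e-8 * np.eye(d)
A_inv = np.linalg.inv(A)

w = np.zeros(d)
for k in range(500):
    # Inexact gradient: mini-batch estimate of the full gradient.
    idx = rng.choice(n, size=32, replace=False)
    z = X[idx] @ w
    g = X[idx].T @ (1.0 / (1.0 + np.exp(-z)) - y[idx]) / len(idx)
    # MM update: minimize the quadratic majorant built at w, with a
    # decaying relaxation factor to damp the gradient noise.
    step = 1.0 / (1.0 + 0.01 * k)
    w = w - step * (A_inv @ g)

print(full_loss(w) < full_loss(np.zeros(d)))
```

With exact gradients this update is the standard quadratic MM iteration; replacing them with mini-batch estimates is precisely the inexact-gradient regime whose convergence the paper analyzes.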

Keywords: Stochastic optimization, convergence analysis, Majorization-Minimization, subspace acceleration, binary logistic regression, image reconstruction.

Category 1: Nonlinear Optimization

Category 2: Convex and Nonsmooth Optimization (Convex Optimization)


Download: [PDF]

Entry Submitted: 08/31/2021
Entry Accepted: 08/31/2021
Entry Last Modified: 05/11/2022

