SABRINA: A Stochastic Subspace Majorization-Minimization Algorithm - HAL Open Archive
Journal article, Journal of Optimization Theory and Applications, 2022

SABRINA: A Stochastic Subspace Majorization-Minimization Algorithm

Abstract

A wide class of problems involves the minimization of a coercive and differentiable function F on R^N whose gradient cannot be evaluated exactly. In such a context, many existing convergence results from the standard gradient-based optimization literature cannot be applied directly, and robustness to errors in the gradient is not necessarily guaranteed. This work investigates the convergence of Majorization-Minimization (MM) schemes when stochastic errors affect the gradient terms. We introduce a general stochastic optimization framework, called SABRINA (StochAstic suBspace majoRIzation-miNimization Algorithm), that encompasses quadratic MM schemes possibly enhanced with a subspace acceleration strategy. New asymptotic results are established for the stochastic process generated by SABRINA. Two sets of numerical experiments in machine learning and image processing support our theoretical results and illustrate the good performance of SABRINA with respect to state-of-the-art gradient-based stochastic optimization methods.
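To make the setting concrete, the following is a minimal sketch (not the authors' SABRINA algorithm) of a stochastic quadratic MM iteration with a memory-gradient subspace, applied to a toy least-squares problem. The problem data, the decaying noise model, and the simple majorant metric L * I are all illustrative assumptions chosen for this example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: F(x) = 0.5 * ||H x - y||^2, a coercive differentiable cost
# whose exact gradient we pretend is unavailable.
H = rng.standard_normal((50, 10))
x_true = rng.standard_normal(10)
y = H @ x_true

def F(x):
    r = H @ x - y
    return 0.5 * r @ r

def noisy_grad(x, k):
    # Exact gradient corrupted by zero-mean noise with decaying variance
    # (an illustrative noise model, not the paper's assumptions).
    return H.T @ (H @ x - y) + rng.standard_normal(x.size) / (k + 1)

# Quadratic majorant: F(z) <= F(x) + g^T (z - x) + (L/2) ||z - x||^2
# holds whenever L >= ||H^T H||_2 (the gradient's Lipschitz constant).
L = np.linalg.norm(H.T @ H, 2)

x = np.zeros(10)
x_prev = x.copy()
for k in range(200):
    g = noisy_grad(x, k)
    # Memory-gradient subspace: span of the (noisy) steepest-descent
    # direction and the previous displacement.
    D = np.column_stack([-g, x - x_prev])
    # Minimize the majorant over x + D u: solve L * (D^T D) u = -D^T g.
    # A least-squares solve handles the rank-deficient first iteration.
    u = np.linalg.lstsq(L * (D.T @ D), -D.T @ g, rcond=None)[0]
    x_prev, x = x, x + D @ u

print(F(np.zeros(10)), F(x))
```

Restricting each update to a low-dimensional subspace keeps the per-iteration majorant minimization cheap (here a 2x2 linear system) while retaining the steepest-descent direction, which is the general idea behind subspace-accelerated MM schemes.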
Main file: main.pdf (782.55 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03793623, version 1 (01-10-2022)

Identifiers

  • HAL Id: hal-03793623, version 1

Cite

Emilie Chouzenoux, Jean-Baptiste Fest. SABRINA: A Stochastic Subspace Majorization-Minimization Algorithm. Journal of Optimization Theory and Applications, 2022, 195, pp.919-952. ⟨hal-03793623⟩
47 views
126 downloads
