Stochastic Heavy Ball

Abstract: This paper deals with a natural stochastic optimization procedure derived from the so-called heavy-ball differential equation, introduced by Polyak in the 1960s in his seminal contribution [Pol64]. The heavy-ball method is a second-order dynamics that was investigated to minimize a convex function f. This family of second-order methods has recently received a large amount of attention, in particular since the famous contribution of Nesterov [Nes83] and the explosion of large-scale optimization problems. This work provides an in-depth description of the stochastic heavy-ball method, an adaptation of the deterministic one in which only unbiased evaluations of the gradient are available and used throughout the iterations of the algorithm. We first establish almost sure convergence results in the case of general non-convex coercive functions f. We then examine the situation of convex and strongly convex potentials and derive non-asymptotic results for the stochastic heavy-ball method. We end our study with limit theorems for several rescaled versions of the algorithm.
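To make the recursion concrete, here is a minimal Python sketch of a generic stochastic heavy-ball (momentum) iteration driven by unbiased gradient estimates. The function names, step size, and momentum parameter below are illustrative assumptions, not the exact scheme or tuning analyzed in the paper.

import numpy as np

def stochastic_heavy_ball(grad_estimate, x0, n_iter=10_000,
                          step=1e-2, momentum=0.9, rng=None):
    # Generic stochastic heavy-ball / momentum-SGD sketch (illustrative only).
    # grad_estimate(x, rng) must return an unbiased estimate of grad f(x).
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)                 # momentum ("velocity") variable
    for _ in range(n_iter):
        g = grad_estimate(x, rng)        # unbiased gradient evaluation
        v = momentum * v - step * g      # heavy-ball memory of past gradients
        x = x + v                        # position update
    return x

# Toy usage: minimize f(x) = 0.5 * ||x||^2 from noisy gradient evaluations.
noisy_grad = lambda x, rng: x + 0.1 * rng.standard_normal(x.shape)
print(stochastic_heavy_ball(noisy_grad, x0=np.ones(5)))  # close to the minimizer 0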
Document type:
Journal article
Electronic Journal of Statistics, Shaker Heights, OH: Institute of Mathematical Statistics, 2018, 12 (1), pp. 461-529. 〈10.1214/18-EJS1395〉

https://hal.archives-ouvertes.fr/hal-01402683
Contributor: Fabien Panloup
Submitted on: Monday, February 18, 2019 - 09:36:26
Last modified on: Friday, March 1, 2019 - 01:26:38

File

EJS_gadat_panloup_saadane.pdf
Publisher files allowed on an open archive

Identifiers

HAL Id: hal-01402683, version 2
DOI: 10.1214/18-EJS1395

Citation

Sébastien Gadat, Fabien Panloup, Sofiane Saadane. Stochastic Heavy Ball. Electronic Journal of Statistics, Shaker Heights, OH: Institute of Mathematical Statistics, 2018, 12 (1), pp. 461-529. 〈10.1214/18-EJS1395〉. 〈hal-01402683v2〉

Metrics

Record views: 8
File downloads: 7