Bridging the Gap between Constant Step Size Stochastic Gradient Descent and Markov Chains

Aymeric Dieuleveut 1, 2 Alain Durmus 3 Francis Bach 1
1 SIERRA - Statistical Machine Learning and Parsimony
DI-ENS - Département d'informatique de l'École normale supérieure, CNRS - Centre National de la Recherche Scientifique, Inria de Paris
Abstract: We consider the minimization of an objective function given access to unbiased estimates of its gradient through stochastic gradient descent (SGD) with constant step-size. While detailed analyses were previously performed only for quadratic functions, we provide an explicit asymptotic expansion of the moments of the averaged SGD iterates that outlines the dependence on initial conditions, the effect of noise and the step-size, as well as the lack of convergence in the general (non-quadratic) case. For this analysis, we bring tools from Markov chain theory into the analysis of stochastic gradient descent. We then show that Richardson-Romberg extrapolation may be used to get closer to the global optimum, and we show empirical improvements of the new extrapolation scheme.
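To make the extrapolation scheme concrete: for a smooth non-quadratic objective, the averaged constant-step-size SGD iterates concentrate around a point of the form theta* + gamma * Delta + O(gamma^2), so combining two runs with step-sizes gamma and 2*gamma as 2 * avg(gamma) - avg(2*gamma) cancels the first-order bias term. The following is a minimal sketch, not the authors' code: the synthetic logistic-regression problem, the function names (stochastic_gradient, averaged_sgd), and the use of two independent runs are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative non-quadratic problem: well-specified logistic regression,
# whose population loss E[log(1 + exp(-y x.theta))] is minimized at theta_star.
d = 10
theta_star = rng.normal(size=d)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def stochastic_gradient(theta):
    # Unbiased gradient estimate from one fresh sample (x, y).
    x = rng.normal(size=d)
    y = 1.0 if rng.random() < sigmoid(x @ theta_star) else -1.0
    return -y * x * sigmoid(-y * (x @ theta))

def averaged_sgd(gamma, n_iters, theta0):
    """Constant step-size SGD with Polyak-Ruppert averaging of the iterates."""
    theta = theta0.copy()
    avg = np.zeros_like(theta)
    for k in range(n_iters):
        theta = theta - gamma * stochastic_gradient(theta)
        avg += (theta - avg) / (k + 1)  # running average of the iterates
    return avg

gamma, n = 0.05, 100_000
theta0 = np.zeros(d)
avg_g = averaged_sgd(gamma, n, theta0)       # bias ~ gamma * Delta + O(gamma^2)
avg_2g = averaged_sgd(2 * gamma, n, theta0)  # bias ~ 2 * gamma * Delta + O(gamma^2)

# Richardson-Romberg extrapolation cancels the first-order term in gamma.
theta_rr = 2 * avg_g - avg_2g

print("error, step gamma   :", np.linalg.norm(avg_g - theta_star))
print("error, extrapolated :", np.linalg.norm(theta_rr - theta_star))
```

One design note on this sketch: the two runs here use independent gradient noise, which still cancels the first-order bias but adds variance; a common variance-reduction refinement is to drive both runs with the same random samples.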
Document type:
Preprint, working paper
2018

Cited literature: [59 references]

https://hal.archives-ouvertes.fr/hal-01565514
Contributor: Alain Durmus
Submitted on: Tuesday, April 10, 2018 - 21:09:38
Last modified: Thursday, April 26, 2018 - 10:29:12

Files

main_arxiv.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01565514, version 2
  • ARXIV : 1707.06386

Citation

Aymeric Dieuleveut, Alain Durmus, Francis Bach. Bridging the Gap between Constant Step Size Stochastic Gradient Descent and Markov Chains. 2018. 〈hal-01565514v2〉

Metrics

Record views: 260
File downloads: 159