Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions - HAL open archive
Journal article, Mathematical Programming, 2020

Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions

Abstract

In this paper we study the convergence properties of Nesterov's family of inertial schemes, a specific case of the inertial gradient descent algorithm, for smooth convex minimization problems, under additional hypotheses on the local geometry of the objective function F, such as the growth (or Łojasiewicz) condition. In particular, we study the convergence rates of the objective function values and of the local variation, depending on these geometric conditions. In this setting we give optimal convergence rates for this Nesterov scheme. Our analysis shows that in some situations Nesterov's family of inertial schemes is asymptotically less efficient than gradient descent (e.g., when the objective function is quadratic).
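To make the comparison in the abstract concrete, here is a minimal sketch (not the authors' code) of an inertial gradient descent scheme of Nesterov type next to plain gradient descent, run on a simple quadratic. The parameter names (`step`, `alpha`) and the specific extrapolation rule are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of a Nesterov-type inertial gradient descent scheme
# versus plain gradient descent, on the quadratic F(x) = x^2 / 2
# (so grad F(x) = x). Parameter names are illustrative assumptions.

def inertial_gd(grad, x0, step, alpha=3.0, iters=200):
    """x_{k+1} = y_k - step * grad(y_k), with the extrapolation
    y_k = x_k + k/(k + alpha) * (x_k - x_{k-1})."""
    x_prev, x = x0, x0
    for k in range(1, iters + 1):
        y = x + (k / (k + alpha)) * (x - x_prev)
        x_prev, x = x, y - step * grad(y)
    return x

def gradient_descent(grad, x0, step, iters=200):
    x = x0
    for _ in range(iters):
        x -= step * grad(x)
    return x

grad = lambda x: x  # gradient of the quadratic F(x) = x^2 / 2
x_inertial = inertial_gd(grad, 1.0, step=0.5)
x_plain = gradient_descent(grad, 1.0, step=0.5)
# Both sequences converge to the minimizer 0, but on quadratics the
# inertial iterates oscillate and their asymptotic rate can be worse
# than plain gradient descent's -- the phenomenon the abstract points to.
```

Running both with the same step size shows gradient descent contracting geometrically on this quadratic, while the inertial iterates approach 0 more slowly with oscillation.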
Main file: iGDgeometrybis.pdf (1.09 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-01965095 , version 1 (24-12-2018)
hal-01965095 , version 2 (05-04-2019)
hal-01965095 , version 3 (19-09-2019)

Identifiers

Cite

Vassilis Apidopoulos, Jean-François Aujol, Charles H Dossal, Aude Rondepierre. Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions. Mathematical Programming, 2020, ⟨10.1007/s10107-020-01476-3⟩. ⟨hal-01965095v3⟩
462 Views
874 Downloads

