
Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions

Abstract: In this paper we study the convergence properties of the Nesterov family of inertial schemes, a specific instance of the inertial gradient descent algorithm, for smooth convex minimization problems under additional hypotheses on the local geometry of the objective function F, such as the growth (or Łojasiewicz) condition. In particular, we derive the convergence rates of the objective function values and of the local variation as they depend on these geometric conditions, and in this setting we obtain optimal convergence rates for this Nesterov scheme. Our analysis shows that in some situations the Nesterov family of inertial schemes is asymptotically less efficient than plain gradient descent (e.g. when the objective function is quadratic).
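The scheme studied here can be illustrated numerically. The sketch below (an assumption about the exact parametrization, not the paper's code) implements a Nesterov-type inertial gradient scheme with extrapolation factor k/(k+α) alongside plain gradient descent, on a quadratic objective F(x) = ½ xᵀAx, the case where the abstract notes inertia can be asymptotically less efficient:

```python
import numpy as np

def grad_descent(grad, x0, step, iters):
    """Plain gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

def inertial_gd(grad, x0, step, iters, alpha=3.0):
    """Nesterov-type inertial scheme with friction parameter alpha:
        y_k     = x_k + k/(k+alpha) * (x_k - x_{k-1})
        x_{k+1} = y_k - step * grad(y_k)
    (an illustrative parametrization; alpha=3 recovers the classical choice).
    """
    x_prev = np.asarray(x0, dtype=float)
    x = x_prev.copy()
    for k in range(1, iters + 1):
        y = x + (k / (k + alpha)) * (x - x_prev)
        x_prev, x = x, y - step * grad(y)
    return x

# Quadratic objective F(x) = 0.5 * x^T A x, minimized at the origin.
A = np.diag([1.0, 10.0])
grad = lambda x: A @ x
x0 = np.array([1.0, 1.0])
step = 1.0 / 10.0  # 1/L, with L the largest eigenvalue of A

x_gd = grad_descent(grad, x0, step, 2000)
x_in = inertial_gd(grad, x0, step, 2000)
```

On such a strongly convex quadratic, plain gradient descent contracts linearly, while the vanishing-momentum inertial iterates decay only polynomially (with damped oscillations), consistent with the asymptotic comparison made in the abstract.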
Document type :
Journal articles

https://hal.archives-ouvertes.fr/hal-01965095
Contributor: Vassilis Apidopoulos
Submitted on : Thursday, September 19, 2019 - 4:30:51 PM
Last modification on : Saturday, December 4, 2021 - 3:43:44 AM
Long-term archiving on: Sunday, February 9, 2020 - 12:49:01 AM

File

iGDgeometrybis.pdf
Files produced by the author(s)

Citation

Vassilis Apidopoulos, Jean-François Aujol, Charles Dossal, Aude Rondepierre. Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions. Mathematical Programming, Springer Verlag, 2020, ⟨10.1007/s10107-020-01476-3⟩. ⟨hal-01965095v3⟩

Metrics

Record views: 294
File downloads: 595