Optimal Convergence Rates for Nesterov Acceleration

Abstract: In this paper, we study the behavior of solutions of the ODE associated with Nesterov acceleration. It has been well known since the pioneering work of Nesterov that the rate of convergence $O(1/t^2)$ is optimal for the class of convex functions with Lipschitz-continuous gradient. In this work, we show that better convergence rates can be obtained under additional geometrical conditions, such as the Łojasiewicz property. More precisely, we prove the optimal convergence rates that can be obtained depending on the geometry of the function $F$ to minimize. These convergence rates are new, and they shed new light on the behavior of Nesterov acceleration schemes. In particular, we prove that the classical Nesterov scheme may yield convergence rates that are worse than those of the classical gradient descent scheme on sharp functions: for instance, the convergence rate for strongly convex functions is not geometric for the classical Nesterov scheme (whereas it is for the gradient descent algorithm). This shows that applying classical Nesterov acceleration to convex functions without examining the geometrical properties of the objective function may lead to sub-optimal algorithms.
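The contrast described in the abstract can be reproduced numerically. The sketch below is an illustration only, not the authors' experiment: it runs plain gradient descent and the classical Nesterov scheme (with the common momentum coefficient $(k-1)/(k+2)$) on a strongly convex quadratic; all parameter choices (the matrix `A`, the modulus `mu`, the iteration count) are illustrative assumptions.

```python
import numpy as np

# Illustrative strongly convex quadratic F(x) = 0.5 * x^T A x,
# with A = diag(1, mu): gradient Lipschitz constant L = 1,
# strong-convexity modulus mu (assumed values, for illustration only).
mu = 0.01
A = np.diag([1.0, mu])

def F(x):
    return 0.5 * float(x @ (A @ x))

def gradF(x):
    return A @ x

step = 1.0      # step size 1/L
n_iter = 200
x0 = np.ones(2)

# Plain gradient descent: geometric (linear) convergence on strongly
# convex functions, with contraction factor (1 - mu/L) per iteration.
x = x0.copy()
for _ in range(n_iter):
    x = x - step * gradF(x)
gd_val = F(x)

# Classical Nesterov scheme for convex functions, momentum (k-1)/(k+2):
# guaranteed O(1/k^2) on this class, but, as the abstract points out,
# the rate is not geometric even though F is strongly convex.
x_prev = x0.copy()
y = x0.copy()
for k in range(1, n_iter + 1):
    x_new = y - step * gradF(y)
    y = x_new + (k - 1) / (k + 2) * (x_new - x_prev)
    x_prev = x_new
nest_val = F(x_prev)

print(gd_val, nest_val)
```

Both runs drive $F$ toward its minimum value $0$, but plotting $F$ against the iteration counter on a log scale would show a straight (geometric) decay for gradient descent and a polynomial, oscillatory decay for the Nesterov iterates.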
Document type :
Journal articles
Contributor: Aude Rondepierre
Submitted on: Monday, June 24, 2019 - 10:54:25 AM
Last modification on: Thursday, May 28, 2020 - 2:48:01 PM
Jean François Aujol, Charles Dossal, Aude Rondepierre. Optimal Convergence Rates for Nesterov Acceleration. SIAM Journal on Optimization, Society for Industrial and Applied Mathematics, 2019, 29 (4), pp.3131-3153. ⟨10.1137/18M1186757⟩. ⟨hal-01786117v4⟩