
V. Apidopoulos, J. Aujol, and C. Dossal, Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule, Mathematical Programming, 2018.

V. Apidopoulos, J. Aujol, and C. Dossal, The differential inclusion modeling FISTA algorithm and optimality of convergence rate in the case b ≤ 3, SIAM Journal on Optimization, vol.28, issue.1, pp.551-574, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01517708

H. Attouch and J. Bolte, On the convergence of the proximal algorithm for nonsmooth functions involving analytic features, Mathematical Programming, vol.116, issue.1, pp.5-16, 2009.
URL : https://hal.archives-ouvertes.fr/hal-00803898

H. Attouch, J. Bolte, P. Redont, and A. Soubeyran, Proximal alternating minimization and projection methods for nonconvex problems: An approach based on the Kurdyka-Łojasiewicz inequality, Mathematics of Operations Research, vol.35, issue.2, pp.438-457, 2010.

H. Attouch and A. Cabot, Convergence rates of inertial forward-backward algorithms, SIAM Journal on Optimization, vol.28, issue.1, pp.849-874, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01962223

H. Attouch, Z. Chbani, J. Peypouquet, and P. Redont, Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity, Mathematical Programming, pp.123-175, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01821929

H. Attouch, Z. Chbani, and H. Riahi, Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3, 2017.

H. Attouch and J. Peypouquet, The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than 1/k^2, SIAM Journal on Optimization, vol.26, issue.3, pp.1824-1834, 2016.

J. Aujol and C. Dossal, Optimal rate of convergence of an ODE associated to the fast gradient descent schemes for b > 0, submitted to Journal of Differential Equations, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01547251

J. Aujol, C. Dossal, and A. Rondepierre, Optimal convergence rates for Nesterov acceleration, 2018.

A. Beck and M. Teboulle, A fast iterative shrinkage-thresholding algorithm for linear inverse problems, SIAM Journal on Imaging Sciences, vol.2, issue.1, pp.183-202, 2009.

J. Bolte, A. Daniilidis, and A. Lewis, The Łojasiewicz inequality for nonsmooth subanalytic functions with applications to subgradient dynamical systems, SIAM Journal on Optimization, vol.17, issue.4, pp.1205-1223, 2006.

J. Bolte, A. Daniilidis, O. Ley, and L. Mazet, Characterizations of Łojasiewicz inequalities: subgradient flows, talweg, convexity, Transactions of the American Mathematical Society, vol.362, pp.3319-3363, 2010.

J. Bolte, T. P. Nguyen, J. Peypouquet, and B. W. Suter, From error bounds to the complexity of first-order descent methods for convex functions, Mathematical Programming, vol.165, issue.2, pp.471-507, 2017.

A. Cabot, H. Engler, and S. Gadat, On the long time behavior of second order differential equations with asymptotically small dissipation, Transactions of the American Mathematical Society, vol.361, issue.11, pp.5983-6017, 2009.

L. Calatroni and A. Chambolle, Backtracking strategies for accelerated descent methods with smooth composite objectives, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01596103

A. Chambolle and C. Dossal, On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm", Journal of Optimization Theory and Applications, vol.166, issue.3, pp.968-982, 2015.

D. Drusvyatskiy and A. Lewis, Error bounds, quadratic growth, and linear convergence of proximal methods, Mathematics of Operations Research, vol.43, issue.3, pp.919-948, 2018.

O. Fercoq and Z. Qu, Restarting accelerated gradient methods with a rough strong convexity estimate, 2016.
URL : https://hal.archives-ouvertes.fr/hal-02287730

O. Fercoq and Z. Qu, Adaptive restart of accelerated gradient methods under local quadratic growth condition, 2017.
URL : https://hal.archives-ouvertes.fr/hal-02269132

P. Frankel, G. Garrigos, and J. Peypouquet, Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates, Journal of Optimization Theory and Applications, vol.165, issue.3, pp.874-900, 2015.

G. Garrigos, L. Rosasco, and S. Villa, Convergence of the forward-backward algorithm: Beyond the worst case with the help of geometry, 2017.

O. Güler, New proximal point algorithms for convex minimization, SIAM Journal on Optimization, vol.2, issue.4, pp.649-664, 1992.

J. M. Holte, Discrete Gronwall lemma and applications, MAA-NCS meeting at the University of North Dakota, vol.24, pp.1-7, 2009.

F. Iutzeler and J. Malick, On the proximal gradient algorithm with alternated inertia, Journal of Optimization Theory and Applications, vol.176, issue.3, pp.688-710, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01685859

A. Y. Kruger, Error bounds and Hölder metric subregularity, Set-Valued and Variational Analysis, vol.23, pp.705-736, 2015.

L. Lessard, B. Recht, and A. Packard, Analysis and design of optimization algorithms via integral quadratic constraints, SIAM Journal on Optimization, vol.26, issue.1, pp.57-95, 2016.

J. Liang, J. Fadili, and G. Peyré, Activity identification and local linear convergence of forward-backward-type methods, SIAM Journal on Optimization, vol.27, issue.1, pp.408-437, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01658850

M. Liu and T. Yang, Adaptive accelerated gradient converging method under Hölderian error bound condition, Advances in Neural Information Processing Systems 30 (NIPS), pp.3106-3116, 2017.

S. Łojasiewicz, Une propriété topologique des sous-ensembles analytiques réels, Les Équations aux Dérivées Partielles, pp.87-89, 1962.

S. Łojasiewicz, Sur la géométrie semi- et sous-analytique, Annales de l'Institut Fourier, vol.43, pp.1575-1595, 1993.

R. May, Asymptotic for a second order evolution equation with convex potential and vanishing damping term, 2015.

B. Merlet and M. Pierre, Convergence to equilibrium for the backward Euler scheme and applications, Communications on Pure & Applied Analysis, vol.9, issue.3, pp.685-702, 2010.

I. Necoara, Y. Nesterov, and F. Glineur, Linear convergence of first order methods for non-strongly convex optimization, Mathematical Programming, pp.1-39, 2018.

A. S. Nemirovskii and Y. E. Nesterov, Optimal methods of smooth convex minimization, USSR Computational Mathematics and Mathematical Physics, vol.25, issue.2, pp.21-30, 1985.

Y. Nesterov, A method of solving a convex programming problem with convergence rate O(1/k^2), Soviet Mathematics Doklady, vol.27, pp.372-376, 1983.

Y. Nesterov, Introductory lectures on convex optimization: A basic course, 2013.

B. O'Donoghue and E. Candès, Adaptive restart for accelerated gradient schemes, Foundations of Computational Mathematics, vol.15, issue.3, pp.715-732, 2015.

B. T. Polyak, Gradient methods for the minimisation of functionals, USSR Computational Mathematics and Mathematical Physics, vol.3, issue.4, pp.864-878, 1963.

B. T. Polyak, Some methods of speeding up the convergence of iteration methods, USSR Computational Mathematics and Mathematical Physics, vol.4, issue.5, pp.1-17, 1964.

R. T. Rockafellar and R. J.-B. Wets, Variational Analysis, vol.317, 2009.

V. Roulet and A. d'Aspremont, Sharpness, restart and acceleration, Advances in Neural Information Processing Systems, pp.1119-1129, 2017.

M. Schmidt, N. L. Roux, and F. Bach, Convergence rates of inexact proximal-gradient methods for convex optimization, NIPS, 2011.
URL : https://hal.archives-ouvertes.fr/inria-00618152

D. Scieur, V. Roulet, F. Bach, and A. d'Aspremont, Integration methods and optimization algorithms, Advances in Neural Information Processing Systems, vol.30, pp.1109-1118, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01474045

W. Su, S. Boyd, and E. J. Candès, A differential equation for modeling Nesterov's accelerated gradient method: theory and insights, Journal of Machine Learning Research, vol.17, issue.153, pp.1-43, 2016.