Convergence rate of inertial forward-backward algorithm beyond Nesterov's rule. Mathematical Programming, 2018.
The differential inclusion modeling FISTA algorithm and optimality of convergence rate in the case b ≤ 3, SIAM Journal on Optimization, vol.28, issue.1, pp.551-574, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01517708
On the convergence of the proximal algorithm for nonsmooth functions involving analytic features, Mathematical Programming, vol.116, issue.1, pp.5-16, 2009.
URL : https://hal.archives-ouvertes.fr/hal-00803898
Proximal alternating minimization and projection methods for nonconvex problems: An approach based on the Kurdyka-Łojasiewicz inequality, Mathematics of Operations Research, vol.35, issue.2, pp.438-457, 2010.
Convergence rates of inertial forward-backward algorithms, SIAM Journal on Optimization, vol.28, issue.1, pp.849-874, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01962223
Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity, Mathematical Programming, pp.123-175, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01821929
Rate of convergence of the Nesterov accelerated gradient method in the subcritical case α ≤ 3, 2017.
The rate of convergence of Nesterov's accelerated forward-backward method is actually faster than 1/k², SIAM Journal on Optimization, vol.26, issue.3, pp.1824-1834, 2016.
Optimal rate of convergence of an ODE associated to the fast gradient descent schemes for b > 0. Submitted to Journal of Differential Equations, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01547251
Optimal convergence rates for Nesterov acceleration, 2018.
A fast iterative shrinkage-thresholding algorithm for linear inverse problems, SIAM Journal on Imaging Sciences, vol.2, issue.1, pp.183-202, 2009.
The Łojasiewicz inequality for nonsmooth subanalytic functions with applications to subgradient dynamical systems, SIAM Journal on Optimization, vol.17, issue.4, pp.1205-1223, 2006.
Characterizations of Łojasiewicz inequalities: subgradient flows, talweg, convexity, Transactions of the American Mathematical Society, vol.362, pp.3319-3363, 2010.
From error bounds to the complexity of first-order descent methods for convex functions, Mathematical Programming, vol.165, issue.2, pp.471-507, 2017.
On the long time behavior of second order differential equations with asymptotically small dissipation, Transactions of the American Mathematical Society, vol.361, issue.11, pp.5983-6017, 2009.
Backtracking strategies for accelerated descent methods with smooth composite objectives, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01596103
On the convergence of the iterates of the "fast iterative shrinkage/thresholding algorithm", Journal of Optimization Theory and Applications, vol.166, issue.3, pp.968-982, 2015.
Error bounds, quadratic growth, and linear convergence of proximal methods, Mathematics of Operations Research, vol.43, issue.3, pp.919-948, 2018.
Restarting accelerated gradient methods with a rough strong convexity estimate, 2016.
URL : https://hal.archives-ouvertes.fr/hal-02287730
Adaptive restart of accelerated gradient methods under local quadratic growth condition, 2017.
URL : https://hal.archives-ouvertes.fr/hal-02269132
Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates, Journal of Optimization Theory and Applications, vol.165, issue.3, pp.874-900, 2015.
Convergence of the forward-backward algorithm: Beyond the worst case with the help of geometry, 2017.
New proximal point algorithms for convex minimization, SIAM Journal on Optimization, vol.2, issue.4, pp.649-664, 1992.
Discrete Gronwall lemma and applications, MAA-NCS meeting at the University of North Dakota, vol.24, pp.1-7, 2009.
On the proximal gradient algorithm with alternated inertia, Journal of Optimization Theory and Applications, vol.176, issue.3, pp.688-710, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01685859
Error bounds and Hölder metric subregularity, Set-Valued and Variational Analysis, vol.23, pp.705-736, 2015.
Analysis and design of optimization algorithms via integral quadratic constraints, SIAM Journal on Optimization, vol.26, issue.1, pp.57-95, 2016.
Activity identification and local linear convergence of forward-backward-type methods, SIAM Journal on Optimization, vol.27, issue.1, pp.408-437, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01658850
Adaptive accelerated gradient converging method under Hölderian error bound condition, Advances in Neural Information Processing Systems 30 (NIPS), pp.3106-3116, 2017.
Une propriété topologique des sous-ensembles analytiques réels, Les Équations aux Dérivées Partielles, pp.87-89, 1962.
Sur la géométrie semi- et sous-analytique, Annales de l'Institut Fourier, vol.43, pp.1575-1595, 1993.
Asymptotic for a second order evolution equation with convex potential and vanishing damping term, 2015.
Convergence to equilibrium for the backward Euler scheme and applications, Communications on Pure & Applied Analysis, vol.9, issue.3, pp.685-702, 2010.
Linear convergence of first order methods for non-strongly convex optimization, Mathematical Programming, pp.1-39, 2018.
Optimal methods of smooth convex minimization, USSR Computational Mathematics and Mathematical Physics, vol.25, issue.2, pp.21-30, 1985.
A method of solving a convex programming problem with convergence rate O(1/k²), Soviet Mathematics Doklady, vol.27, pp.372-376, 1983.
Introductory lectures on convex optimization: A basic course, 2013.
Adaptive restart for accelerated gradient schemes, Foundations of Computational Mathematics, vol.15, issue.3, pp.715-732, 2015.
Gradient methods for the minimisation of functionals, USSR Computational Mathematics and Mathematical Physics, vol.3, issue.4, pp.864-878, 1963.
Some methods of speeding up the convergence of iteration methods, USSR Computational Mathematics and Mathematical Physics, vol.4, issue.5, pp.1-17, 1964.
Alexandre d'Aspremont. Sharpness, restart and acceleration, Advances in Neural Information Processing Systems, pp.1119-1129, 2017.
Convergence rates of inexact proximal-gradient methods for convex optimization, NIPS, 2011.
URL : https://hal.archives-ouvertes.fr/inria-00618152
Integration methods and optimization algorithms, Advances in Neural Information Processing Systems, vol.30, pp.1109-1118, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01474045
A differential equation for modeling Nesterov's accelerated gradient method: theory and insights, Journal of Machine Learning Research, vol.17, issue.153, pp.1-43, 2016.