Z. Allen-Zhu, Natasha: Faster non-convex stochastic optimization via strongly non-convex parameter, 2016.

A. Beck and M. Teboulle, A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems, SIAM Journal on Imaging Sciences, vol.2, issue.1, pp.183-202, 2009.
DOI: 10.1137/080716542

D. P. Bertsekas, Nonlinear Programming, Athena Scientific, 1999.

D. P. Bertsekas, Convex Optimization Algorithms, Athena Scientific, 2015.

J. Bolte, T. P. Nguyen, J. Peypouquet, and B. Suter, From error bounds to the complexity of first-order descent methods for convex functions, Mathematical Programming, 2016.
DOI: 10.1007/s10107-016-1091-6

J. M. Borwein and A. S. Lewis, Convex Analysis and Nonlinear Optimization: Theory and Examples, Springer, 2006.

Y. Carmon, J. C. Duchi, O. Hinder, and A. Sidford, Accelerated methods for non-convex optimization, 2016.

Y. Carmon, O. Hinder, J. C. Duchi, and A. Sidford, "Convex until proven guilty": Dimension-free acceleration of gradient descent on non-convex functions, 2017.

C. Cartis, N. I. Gould, and P. L. Toint, On the complexity of finding first-order critical points in constrained nonlinear optimization, Mathematical Programming, 2014.

C. Cartis, N. I. Gould, and P. L. Toint, On the Complexity of Steepest Descent, Newton's and Regularized Newton's Methods for Nonconvex Unconstrained Optimization Problems, SIAM Journal on Optimization, vol.20, issue.6, pp.2833-2852, 2010.
DOI: 10.1137/090774100

F. H. Clarke, R. J. Stern, and P. R. Wolenski, Proximal smoothness and the lower-C² property, Journal of Convex Analysis, vol.2, issue.1-2, pp.117-144, 1995.

A. Defazio, F. Bach, and S. Lacoste-Julien, SAGA: A fast incremental gradient method with support for non-strongly convex composite objectives, Advances in Neural Information Processing Systems (NIPS), 2014.
URL: https://hal.archives-ouvertes.fr/hal-01016843

D. Drusvyatskiy and C. Paquette, Efficiency of minimizing compositions of convex functions and smooth maps, 2016.

H. Federer, Curvature measures, Transactions of the American Mathematical Society, vol.93, pp.418-491, 1959.
DOI: 10.2307/1993504

S. Ghadimi and G. Lan, Accelerated gradient methods for nonconvex nonlinear and stochastic programming, Mathematical Programming, vol.156, issue.1-2, pp.59-99, 2016.
DOI: 10.1007/s10107-015-0871-8
URL: http://arxiv.org/abs/1310.3787

S. Ghadimi, G. Lan, and H. Zhang, Generalized uniformly optimal methods for nonlinear programming, 2015.

O. Güler, On the Convergence of the Proximal Point Algorithm for Convex Minimization, SIAM Journal on Control and Optimization, vol.29, issue.2, pp.403-419, 1991.
DOI: 10.1137/0329022

T. Hastie, R. Tibshirani, and M. Wainwright, Statistical Learning with Sparsity: The Lasso and Generalizations, CRC Press, 2015.

R. Johnson and T. Zhang, Accelerating stochastic gradient descent using predictive variance reduction, Advances in Neural Information Processing Systems (NIPS), 2013.

G. Lan, An optimal randomized incremental gradient method, 2015.

H. Li and Z. Lin, Accelerated proximal gradient methods for nonconvex programming, Advances in Neural Information Processing Systems (NIPS), 2015.

H. Lin, J. Mairal, and Z. Harchaoui, A universal catalyst for first-order optimization, Advances in Neural Information Processing Systems (NIPS), 2015.
URL: https://hal.archives-ouvertes.fr/hal-01160728

J. Mairal, F. Bach, and J. Ponce, Sparse modeling for image and vision processing, Foundations and Trends in Computer Graphics and Vision, vol.8, issue.2-3, pp.85-283, 2014.
DOI: 10.1561/0600000058
URL: https://hal.archives-ouvertes.fr/hal-01081139

J. Mairal, F. Bach, J. Ponce, and G. Sapiro, Online learning for matrix factorization and sparse coding, Journal of Machine Learning Research (JMLR), vol.11, pp.19-60, 2010.
URL: https://hal.archives-ouvertes.fr/inria-00408716

Y. Nesterov, A method of solving a convex programming problem with convergence rate O(1/k^2), Soviet Mathematics Doklady, vol.27, issue.2, pp.372-376, 1983.

Y. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, Kluwer Academic Publishers, 2004.
DOI: 10.1007/978-1-4419-8853-9

Y. Nesterov, How to make the gradients small, Optima, MPS Newsletter, issue.88, pp.10-11, 2012.

Y. Nesterov, Gradient methods for minimizing composite functions, Mathematical Programming, vol.140, issue.1, pp.125-161, 2013.
DOI: 10.1007/s10107-012-0629-5

N. Parikh and S. P. Boyd, Proximal Algorithms, Foundations and Trends in Optimization, vol.1, issue.3, pp.123-231, 2014.
DOI: 10.1561/2400000003

R. A. Poliquin and R. T. Rockafellar, Prox-regular functions in variational analysis, Transactions of the American Mathematical Society, vol.348, issue.5, pp.1805-1838, 1996.

S. J. Reddi, S. Sra, B. Poczos, and A. J. Smola, Proximal stochastic methods for nonsmooth nonconvex finite-sum optimization, Advances in Neural Information Processing Systems (NIPS), 2016.

R. T. Rockafellar, Favorable classes of Lipschitz-continuous functions in subgradient optimization, in Progress in Nondifferentiable Optimization, IIASA Collaborative Proceedings Series CP-82, pp.125-143, 1982.

R. T. Rockafellar and R. J. Wets, Variational Analysis, vol.317 of Grundlehren der Mathematischen Wissenschaften, Springer, 1998.
DOI: 10.1007/978-3-642-02431-3

M. Schmidt, N. Le Roux, and F. Bach, Minimizing finite sums with the stochastic average gradient, Mathematical Programming, vol.162, issue.1-2, pp.83-112, 2017.
DOI: 10.1007/s10107-016-1030-6
URL: https://hal.archives-ouvertes.fr/hal-00860051

P. Tseng, On accelerated proximal gradient methods for convex-concave optimization, 2008.

B. E. Woodworth and N. Srebro, Tight complexity bounds for optimizing composite objectives, Advances in Neural Information Processing Systems (NIPS), 2016.

L. Xiao and T. Zhang, A Proximal Stochastic Gradient Method with Progressive Variance Reduction, SIAM Journal on Optimization, vol.24, issue.4, pp.2057-2075, 2014.
DOI: 10.1137/140961791
URL: http://arxiv.org/abs/1403.4699

H. Zou and T. Hastie, Regularization and variable selection via the elastic net, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.67, issue.2, pp.301-320, 2005.
URL: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.124.4696