A. Beck and M. Teboulle, A fast iterative shrinkage-thresholding algorithm for linear inverse problems, SIAM Journal on Imaging Sciences, vol.2, issue.1, pp.183-202, 2009. DOI: 10.1137/080716542

Y. Nesterov, Gradient methods for minimizing composite objective function, CORE Discussion Papers, issue.76, 2007.

R. Tibshirani, Regression shrinkage and selection via the Lasso, Journal of the Royal Statistical Society: Series B, vol.58, issue.1, pp.267-288, 1996.

S. S. Chen, D. L. Donoho, and M. A. Saunders, Atomic decomposition by basis pursuit, SIAM Journal on Scientific Computing, vol.20, issue.1, pp.33-61, 1998. DOI: 10.1137/S1064827596304010

S. J. Wright, R. D. Nowak, and M. A. Figueiredo, Sparse reconstruction by separable approximation, IEEE Transactions on Signal Processing, vol.57, issue.7, pp.2479-2493, 2009. DOI: 10.1109/TSP.2009.2016892

F. Bach, R. Jenatton, J. Mairal, and G. Obozinski, Convex optimization with sparsity-inducing norms, Optimization for Machine Learning, 2011. DOI: 10.1561/2200000015

J. Fadili and G. Peyré, Total variation projection with first order schemes, IEEE Transactions on Image Processing, vol.20, issue.3, pp.657-669, 2011. DOI: 10.1109/TIP.2010.2072512

X. Chen, S. Kim, Q. Lin, J. G. Carbonell, and E. P. Xing, Graph-structured multi-task regression and an efficient optimization method for general fused Lasso, 2010.

J. Cai, E. J. Candès, and Z. Shen, A singular value thresholding algorithm for matrix completion, SIAM Journal on Optimization, vol.20, issue.4, 2010. DOI: 10.1137/080738970

S. Ma, D. Goldfarb, and L. Chen, Fixed point and Bregman iterative methods for matrix rank minimization, Mathematical Programming, vol.128, issue.1-2, pp.321-353, 2011. DOI: 10.1007/s10107-009-0306-5

L. Jacob, G. Obozinski, and J. Vert, Group lasso with overlap and graph lasso, Proceedings of the 26th Annual International Conference on Machine Learning, ICML '09, 2009. DOI: 10.1145/1553374.1553431

R. Jenatton, J. Mairal, G. Obozinski, and F. Bach, Proximal methods for sparse hierarchical dictionary learning, JMLR, vol.12, pp.2297-2334, 2011.

A. Barbero and S. Sra, Fast Newton-type methods for total variation regularization, 2011.

J. Liu and J. Ye, Fast overlapping group Lasso, 2010.

M. Schmidt and K. Murphy, Convex structure learning in log-linear models: Beyond pairwise potentials, AISTATS, 2010.

M. Patriksson, A unified framework of descent algorithms for nonlinear programs and variational inequalities, 1995.

P. L. Combettes, Solving monotone inclusions via compositions of nonexpansive averaged operators, Optimization, vol.53, issue.5-6, pp.475-504, 2004.

J. Duchi and Y. Singer, Efficient online and batch learning using forward backward splitting, JMLR, vol.10, pp.2873-2898, 2009.

J. Langford, L. Li, and T. Zhang, Sparse online learning via truncated gradient, JMLR, vol.10, pp.777-801, 2009.

M. Baes, Estimate sequence methods: extensions and approximations, 2009.

O. Devolder, F. Glineur, and Y. Nesterov, First-order methods of smooth convex optimization with inexact oracle, Mathematical Programming, 2011. DOI: 10.1007/s10107-013-0677-5

A. Nedic and D. Bertsekas, Convergence rate of incremental subgradient algorithms. Stochastic Optimization: Algorithms and Applications, pp.263-304, 2000.

Z. Luo and P. Tseng, Error bounds and convergence analysis of feasible descent methods: a general approach, Annals of Operations Research, vol.46, issue.1, pp.157-178, 1993. DOI: 10.1007/BF02096261

M. P. Friedlander and M. Schmidt, Hybrid deterministic-stochastic methods for data fitting, 2011.

R. T. Rockafellar, Monotone operators and the proximal point algorithm, SIAM Journal on Control and Optimization, vol.14, issue.5, pp.877-898, 1976. DOI: 10.1137/0314056

O. Güler, New proximal point algorithms for convex minimization, SIAM Journal on Optimization, vol.2, issue.4, pp.649-664, 1992. DOI: 10.1137/0802032

S. Villa, S. Salzo, L. Baldassarre, and A. Verri, Accelerated and inexact forward-backward algorithms, Optimization Online, 2011. DOI: 10.1137/110844805

K. Jiang, D. Sun, and K. C. Toh, An inexact accelerated proximal gradient method for large scale linearly constrained convex SDP, Optimization Online, 2011.

Y. Nesterov, Introductory Lectures on Convex Optimization: A Basic Course, 2004. DOI: 10.1007/978-1-4419-8853-9

D. P. Bertsekas, Convex optimization theory, Athena Scientific, 2009.

P. Tseng, On accelerated proximal gradient methods for convex-concave optimization, 2008.

J. Mairal, R. Jenatton, G. Obozinski, and F. Bach, Convex and network flow optimization for structured sparsity, JMLR, vol.12, pp.2681-2720, 2011.

H. H. Bauschke and P. L. Combettes, A Dykstra-like algorithm for two monotone operators, Pacific Journal of Optimization, vol.4, issue.3, pp.383-391, 2008.

Y. Nesterov, Smooth minimization of non-smooth functions, Mathematical Programming, vol.103, issue.1, pp.127-152, 2005. DOI: 10.1007/s10107-004-0552-5

P. L. Combettes and J. Pesquet, Proximal splitting methods in signal processing, Fixed-Point Algorithms for Inverse Problems in Science and Engineering, pp.185-212, 2011. DOI: 10.1007/978-1-4419-9569-8_10

M. J. Wainwright, T. S. Jaakkola, and A. S. Willsky, Tree-reweighted belief propagation algorithms and approximate ML estimation by pseudo-moment matching, AISTATS, 2003.

J. Kivinen, A. J. Smola, and R. C. Williamson, Online learning with kernels, IEEE Transactions on Signal Processing, vol.52, issue.8, pp.2165-2176, 2004. DOI: 10.1109/TSP.2004.830991

M. Schmidt, D. Kim, and S. Sra, Projected Newton-type methods in machine learning, Optimization for Machine Learning, 2011.

D. P. Bertsekas, A. Nedić, and A. E. Ozdaglar, Convex Analysis and Optimization, Athena Scientific, 2003.