H. Akaike, Information theory and an extension of the maximum likelihood principle, Second International Symposium on Information Theory, pp.267-281, 1973.

P. J. Bickel, Y. Ritov, and A. B. Tsybakov, Simultaneous analysis of Lasso and Dantzig selector, The Annals of Statistics, vol.37, issue.4, pp.1705-1732, 2009.
DOI: 10.1214/08-AOS620

URL: https://hal.archives-ouvertes.fr/hal-00401585

P. Craven and G. Wahba, Smoothing noisy data with spline functions, Numerische Mathematik, vol.31, issue.4, pp.377-403, 1979.
DOI: 10.1007/BF01404567

I. Daubechies, M. Defrise, and C. De Mol, An iterative thresholding algorithm for linear inverse problems with a sparsity constraint, Communications on Pure and Applied Mathematics, vol.57, issue.11, pp.1413-1457, 2004.
DOI: 10.1002/cpa.20042

C. Dossal, A necessary and sufficient condition for exact recovery by ℓ1 minimization, preprint, 2007.
URL: https://hal.archives-ouvertes.fr/hal-00164738

B. Efron, The Estimation of Prediction Error, Journal of the American Statistical Association, vol.99, issue.467, pp.619-642, 2004.
DOI: 10.1198/016214504000000692

B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani, Least angle regression (with discussion), The Annals of Statistics, vol.32, issue.2, pp.407-499, 2004.

B. Efron, How Biased is the Apparent Error Rate of a Prediction Rule?, Journal of the American Statistical Association, vol.81, issue.394, pp.461-470, 1986.
DOI: 10.1080/01621459.1986.10478291

J. Fan and R. Li, Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties, Journal of the American Statistical Association, vol.96, issue.456, pp.1348-1360, 2001.
DOI: 10.1198/016214501753382273

URL: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.128.4174

J. Fan and H. Peng, Nonconcave penalized likelihood with a diverging number of parameters, The Annals of Statistics, vol.32, issue.3, pp.928-961, 2004.

J. J. Fuchs, On Sparse Representations in Arbitrary Redundant Bases, IEEE Transactions on Information Theory, vol.50, issue.6, pp.1341-1344, 2004.
DOI: 10.1109/TIT.2004.828141

K. Kato, On the degrees of freedom in shrinkage estimation, Journal of Multivariate Analysis, vol.100, issue.7, pp.1338-1352, 2009.
DOI: 10.1016/j.jmva.2008.12.002

F. Luisier, The SURE-LET approach to image denoising, Ph.D. thesis no. 4566, EPFL, Lausanne, 2009.

C. Mallows, Some comments on Cp, Technometrics, vol.15, issue.4, pp.661-675, 1973.
DOI: 10.2307/1267380

M. Meyer and M. Woodroofe, On the degrees of freedom in shape restricted regression, The Annals of Statistics, vol.28, pp.1083-1104, 2000.

Y. Nardi and A. Rinaldo, On the asymptotic properties of the group lasso estimator for linear models, Electronic Journal of Statistics, vol.2, pp.605-633, 2008.
DOI: 10.1214/08-EJS200

M. Osborne, B. Presnell, and B. Turlach, A new approach to variable selection in least squares problems, IMA Journal of Numerical Analysis, vol.20, issue.3, pp.389-403, 2000.
DOI: 10.1093/imanum/20.3.389

M. R. Osborne, B. Presnell, and B. Turlach, On the LASSO and its dual, Journal of Computational and Graphical Statistics, vol.9, pp.319-337, 2000.

P. Ravikumar, H. Liu, J. Lafferty, and L. Wasserman, SpAM: Sparse additive models, Advances in Neural Information Processing Systems (NIPS), 2008.
DOI: 10.1111/j.1467-9868.2009.00718.x

S. Rosset, J. Zhu, and T. Hastie, Boosting as a Regularized Path to a Maximum Margin Classifier, Journal of Machine Learning Research, vol.5, pp.941-973, 2004.

S. Sardy, A. Bruce, and P. Tseng, Block coordinate relaxation methods for nonparametric wavelet denoising, Journal of Computational and Graphical Statistics, vol.9, issue.2, pp.361-379, 2000.

G. Schwarz, Estimating the Dimension of a Model, The Annals of Statistics, vol.6, issue.2, pp.461-464, 1978.
DOI: 10.1214/aos/1176344136

C. Stein, Estimation of the Mean of a Multivariate Normal Distribution, The Annals of Statistics, vol.9, issue.6, pp.1135-1151, 1981.
DOI: 10.1214/aos/1176345632

R. Tibshirani and J. Taylor, The solution path of the generalized lasso, The Annals of Statistics, vol.39, issue.3, 2011.
DOI: 10.1214/11-AOS878

R. Tibshirani and J. Taylor, Degrees of freedom in lasso problems, The Annals of Statistics, vol.40, issue.2, 2012.
DOI: 10.1214/12-AOS1003

R. Tibshirani, Regression shrinkage and selection via the Lasso, Journal of the Royal Statistical Society, Series B, vol.58, issue.1, pp.267-288, 1996.

J. A. Tropp, Just relax: convex programming methods for identifying sparse signals in noise, IEEE Transactions on Information Theory, vol.52, issue.3, pp.1030-1051, 2006.
DOI: 10.1109/TIT.2005.864420

S. Vaiter, G. Peyré, C. Dossal, and M. J. Fadili, Robust Sparse Analysis Regularization, IEEE Transactions on Information Theory, vol.59, issue.4, 2013.
DOI: 10.1109/TIT.2012.2233859

URL: https://hal.archives-ouvertes.fr/hal-00627452

M. Yuan and Y. Lin, Model selection and estimation in regression with grouped variables, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.68, issue.1, pp.49-67, 2006.
DOI: 10.1111/j.1467-9868.2005.00532.x

P. Zhao and B. Yu, On model selection consistency of Lasso, Journal of Machine Learning Research, vol.7, pp.2541-2563, 2006.

H. Zou, The Adaptive Lasso and Its Oracle Properties, Journal of the American Statistical Association, vol.101, issue.476, pp.1418-1429, 2006.
DOI: 10.1198/016214506000000735

H. Zou, T. Hastie, and R. Tibshirani, On the "degrees of freedom" of the lasso, The Annals of Statistics, vol.35, issue.5, pp.2173-2192, 2007.
DOI: 10.1214/009053607000000127