F. Bunea, A. Tsybakov, and M. Wegkamp, Sparsity oracle inequalities for the Lasso, Electronic Journal of Statistics, vol.1, pp.169-194, 2007.
DOI : 10.1214/07-EJS008

URL : https://hal.archives-ouvertes.fr/hal-00160646

G. Casella and R. L. Berger, Statistical inference. The Wadsworth & Brooks/Cole Statistics/Probability Series, 1990.

B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani, Least angle regression (with discussion), Ann. Statist., vol.32, issue.2, pp.407-499, 2004.

P. Garrigues and L. El Ghaoui, An homotopy algorithm for the lasso with online observations, To appear in Advances in Neural Information Processing Systems (NIPS) 21, 2008.

L. Györfi, M. Kohler, A. Krzyżak, and H. Walk, A distribution-free theory of nonparametric regression, Springer, 2002.
DOI : 10.1007/b97848

M. Hebiri, Regularization with the smooth-lasso procedure, 2008.
URL : https://hal.archives-ouvertes.fr/hal-00260816

K. Knight and W. Fu, Asymptotics for lasso-type estimators, Ann. Statist, vol.28, issue.5, pp.1356-1378, 2000.

N. Meinshausen and P. Bühlmann, High-dimensional graphs and variable selection with the Lasso, The Annals of Statistics, vol.34, issue.3, pp.1436-1462, 2006.
DOI : 10.1214/009053606000000281

S. Rosset and J. Zhu, Piecewise linear regularized solution paths, The Annals of Statistics, vol.35, issue.3, pp.1012-1030, 2007.
DOI : 10.1214/009053606000001370

R. Tibshirani, Regression shrinkage and selection via the lasso, J. Roy. Statist. Soc. Ser. B, vol.58, issue.1, pp.267-288, 1996.

R. Tibshirani, M. Saunders, S. Rosset, J. Zhu, and K. Knight, Sparsity and smoothness via the fused lasso, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.67, issue.1, pp.91-108, 2005.

V. Vapnik, Statistical learning theory. Adaptive and Learning Systems for Signal Processing, Communications, and Control, Wiley, 1998.

V. Vovk, Asymptotic Optimality of Transductive Confidence Machine, Algorithmic learning theory, pp.336-350, 2002.
DOI : 10.1007/3-540-36169-3_27

V. Vovk, On-line confidence machines are well-calibrated, Proceedings of the 43rd Annual IEEE Symposium on Foundations of Computer Science (FOCS 2002), pp.187-196, 2002.
DOI : 10.1109/SFCS.2002.1181895

V. Vovk, A. Gammerman, and C. Saunders, Machine-learning applications of algorithmic randomness, Proceedings of the Sixteenth International Conference on Machine Learning, pp.444-453, 1999.

V. Vovk, A. Gammerman, and G. Shafer, Algorithmic learning in a random world, Springer, 2005.

V. Vovk, I. Nouretdinov, and A. Gammerman, On-line predictive linear regression, The Annals of Statistics, vol.37, issue.3, 2009.
DOI : 10.1214/08-AOS622

M. Yuan and Y. Lin, Model selection and estimation in regression with grouped variables, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.68, issue.1, pp.49-67, 2006.

P. Zhao and B. Yu, On model selection consistency of Lasso, J. Mach. Learn. Res, vol.7, pp.2541-2563, 2006.

H. Zou, The Adaptive Lasso and Its Oracle Properties, Journal of the American Statistical Association, vol.101, issue.476, pp.1418-1429, 2006.
DOI : 10.1198/016214506000000735

H. Zou and T. Hastie, Regularization and variable selection via the elastic net, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.67, issue.2, pp.301-320, 2005.

H. Zou, T. Hastie, and R. Tibshirani, On the "degrees of freedom" of the lasso, The Annals of Statistics, vol.35, issue.5, pp.2173-2192, 2007.
DOI : 10.1214/009053607000000127