R. E. Schapire, A brief introduction to boosting, Proceedings of the Sixteenth International Joint Conference on Artificial Intelligence, 1999.

L. Breiman, Bagging predictors, Machine Learning, vol.24, issue.2, pp.123-140, 1996.
DOI : 10.1007/BF00058655

P. Bühlmann and B. Yu, Analyzing bagging, The Annals of Statistics, vol.30, issue.4, pp.927-961, 2002.

L. Breiman, Random forests, Machine Learning, vol.45, issue.1, pp.5-32, 2001.
DOI : 10.1023/A:1010933404324

L. Breiman, Using iterated bagging to debias regressions, Machine Learning, vol.45, issue.3, pp.261-277, 2001.
DOI : 10.1023/A:1017934522171

Y. Freund and R. E. Schapire, A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting, Journal of Computer and System Sciences, vol.55, issue.1, pp.119-139, 1997.
DOI : 10.1006/jcss.1997.1504

R. E. Schapire, Y. Freund, P. Bartlett, and W. S. Lee, Boosting the margin: a new explanation for the effectiveness of voting methods, The Annals of Statistics, vol.26, issue.5, pp.1651-1686, 1998.
DOI : 10.1214/aos/1024691352

J. Friedman, T. Hastie, and R. Tibshirani, Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors), The Annals of Statistics, vol.28, issue.2, pp.337-407, 2000.
DOI : 10.1214/aos/1016218223

G. Lugosi and N. Vayatis, On the Bayes-risk consistency of regularized boosting methods, The Annals of Statistics, vol.32, issue.1, pp.30-55, 2004.
URL : https://hal.archives-ouvertes.fr/hal-00102140

G. Blanchard, G. Lugosi, and N. Vayatis, On the rate of convergence of regularized boosting classifiers, Journal of Machine Learning Research (Special issue on learning theory), vol.4, pp.861-894, 2003.

L. Breiman, Arcing classifiers (with discussion), The Annals of Statistics, vol.26, issue.3, pp.801-849, 1998.

J. Friedman, Greedy function approximation: A gradient boosting machine, The Annals of Statistics, vol.29, issue.5, pp.1189-1232, 2001.
DOI : 10.1214/aos/1013203451

R. S. Zemel and T. Pitassi, A gradient-based boosting algorithm for regression problems, Advances in Neural Information Processing Systems 13 (NIPS), pp.696-702, 2001.

G. Ridgeway, D. Madigan, and T. Richardson, Boosting methodology for regression problems, Proc. of the 7th Int. Workshop on Artificial Intelligence and Statistics, 1999.

H. Drucker, Improving regressors using boosting techniques, Proc. of the 14th Int. Conf. on Machine Learning, pp.107-115, 1997.

S. Borra and A. Di Ciacco, Improving nonparametric regression methods by bagging and boosting, Computational Statistics & Data Analysis, vol.38, issue.4, pp.407-420, 2002.
DOI : 10.1016/S0167-9473(01)00068-8

J. Friedman, Multivariate Adaptive Regression Splines, The Annals of Statistics, vol.19, issue.1, pp.1-141, 1991.
DOI : 10.1214/aos/1176347963

N. Chèze, J. Poggi, and B. Portier, Partial and recombined estimators for nonlinear additive models, Statistical Inference for Stochastic Processes, vol.6, issue.2, pp.155-197, 2003.
DOI : 10.1023/A:1023940117323

B. Ghattas, Prévision des pics d'ozone par arbres de régression simples et agrégés par bootstrap [Ozone peak prediction by single and bootstrap-aggregated regression trees], Revue de Statistique Appliquée, vol.47, issue.2, pp.61-80, 1999.

O. Bousquet and A. Elisseeff, Stability and generalization, Journal of Machine Learning Research, vol.2, pp.499-526, 2002.

L. Breiman, Heuristics of instability and stabilization in model selection, The Annals of Statistics, vol.24, issue.6, pp.2350-2383, 1996.
DOI : 10.1214/aos/1032181158

S. Gey and J. Poggi, Boosting and instability for regression trees, Computational Statistics & Data Analysis, vol.50, issue.2, pp.533-550, 2006.
DOI : 10.1016/j.csda.2004.09.001
URL : https://hal.archives-ouvertes.fr/hal-00326558