K. Ali and M. J. Pazzani, On the link between error correlation and error reduction in decision tree ensembles, Department of Information and Computer Science, University of California, Irvine, 1995.

S. Bernard, S. Adam, and L. Heutte, Dynamic random forests, Pattern Recognition Letters, vol.33, issue.12, pp.1580-1586, 2012.
URL : https://hal.archives-ouvertes.fr/hal-00710083

R. K. Bock, A. Chilingarian, M. Gaug, F. Hakl, T. Hengstebeck et al., Methods for multidimensional event classification: a case study using images from a Cherenkov gamma-ray telescope, Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, vol.516, issue.2-3, pp.511-528, 2004.

A. M. Bruckstein, M. Elad, and M. Zibulevsky, Sparse non-negative solution of a linear system of equations is unique, 3rd International Symposium on Communications, Control and Signal Processing, pp.762-767, 2008.

G. Brown and L. I. Kuncheva, "Good" and "bad" diversity in majority vote ensembles, Multiple Classifier Systems, pp.124-133, 2010.

D. A. Belsley, E. Kuh, and R. E. Welsch, Regression diagnostics: Identifying influential data and sources of collinearity, vol.571, 1980.

L. Buitinck, G. Louppe, M. Blondel, F. Pedregosa, A. Mueller et al., API design for machine learning software: experiences from the scikit-learn project, ECML PKDD Workshop: Languages for Data Mining and Machine Learning, pp.108-122, 2013.
URL : https://hal.archives-ouvertes.fr/hal-00856511

L. Breiman, Random forests. Machine learning, vol.45, pp.5-32, 2001.

M. Buscema, Metanet*: The theory of independent judges. Substance use & misuse, vol.33, pp.439-461, 1998.

L. F. Cranor and B. A. LaMacchia, Spam!, Communications of the ACM, vol.41, issue.8, pp.74-83, 1998.

C. Cortes, M. Mohri, and D. Storcheus, Regularized gradient boosting, Advances in Neural Information Processing Systems, vol.32, pp.5449-5458, 2019.

R. Caruana, A. Niculescu-mizil, G. Crew, and A. Ksikes, Ensemble selection from libraries of models, Proceedings of the twenty-first international conference on Machine learning, p.18, 2004.

P. I. Corke, A robotics toolbox for MATLAB, IEEE Robotics & Automation Magazine, vol.3, issue.1, pp.24-32, 1996.

G. Davis, S. Mallat, and M. Avellaneda, Adaptive greedy approximations. Constructive approximation, vol.13, pp.57-98, 1997.

B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani, Least angle regression, The Annals of Statistics, vol.32, pp.407-499, 2004.

K. Fawagreh, M. M. Gaber, and E. Elyan, On extreme pruning of random forest ensembles for real-time predictive applications, arXiv e-prints, 2015.

J. H. Friedman, Greedy function approximation: a gradient boosting machine, Annals of Statistics, pp.1189-1232, 2001.

R. Gribonval and P. Vandergheynst, On the exponential convergence of matching pursuits in quasi-incoherent dictionaries, IEEE Transactions on Information Theory, vol.52, issue.1, pp.255-261, 2005.
URL : https://hal.archives-ouvertes.fr/inria-00544945

G. B. Huang, M. Mattar, T. Berg, and E. Learned-Miller, Labeled faces in the wild: A database for studying face recognition in unconstrained environments, 2008.

T. Head, MechCoder, G. Louppe, I. Shcherbatyi et al., scikit-optimize, 2018.

Q. Hu, D. Yu, Z. Xie, and X. Li, Eros: Ensemble rough subspaces, Pattern recognition, vol.40, issue.12, pp.3728-3739, 2007.

V. Y. Kulkarni and P. K. Sinha, Pruning of random forest classifiers: A survey and future directions, 2012 International Conference on Data Science & Engineering (ICDSE), pp.64-68, 2012.

F. Locatello, A. Raj, S. P. Karimireddy, G. Rätsch, B. Schölkopf, S. U. Stich et al., On matching pursuit and coordinate descent, 2018.

L. Mason, J. Baxter, P. L. Bartlett, and M. R. Frean, Boosting algorithms as gradient descent, Advances in Neural Information Processing Systems, pp.512-518, 2000.

O. L. Mangasarian, W. N. Street, and W. H. Wolberg, Breast cancer diagnosis and prognosis via linear programming, Operations Research, vol.43, issue.4, pp.570-577, 1995.

R. K. Pace and R. Barry, Sparse spatial autoregressions, Statistics & Probability Letters, vol.33, issue.3, pp.291-297, 1997.

Y. C. Pati, R. Rezaiifar, and P. S. Krishnaprasad, Orthogonal matching pursuit: Recursive function approximation with applications to wavelet decomposition, Proceedings of the 27th Asilomar Conference on Signals, Systems and Computers, pp.40-44, 1993.

L. Rokach, Collective-agreement-based pruning of ensembles, Computational Statistics & Data Analysis, vol.53, issue.4, pp.1015-1026, 2009.

A. D. Shapiro, Structured induction in expert systems, 1987.

J. A. Tropp, Greed is good: Algorithmic results for sparse approximation, IEEE Transactions on Information Theory, vol.50, issue.10, pp.2231-2242, 2004.

F. Yang, W. Lu, L. Luo, and T. Li, Margin optimization based pruning for random forest, Neurocomputing, vol.94, pp.54-63, 2012.

H. Zhang and M. Wang, Search for the smallest random forest, Statistics and its Interface, vol.2, issue.3, pp.381-388, 2009.