Journal articles

Extremely Randomized Trees

Abstract: This paper proposes a new tree-based ensemble method for supervised classification and regression problems. It essentially consists of strongly randomizing both attribute and cut-point choice while splitting a tree node. In the extreme case, it builds totally randomized trees whose structures are independent of the output values of the learning sample. The strength of the randomization can be tuned to problem specifics by the appropriate choice of a parameter. We evaluate the robustness of the default choice of this parameter, and we also provide insight into how to adjust it in particular situations. Besides accuracy, the main strength of the resulting algorithm is computational efficiency. A bias/variance analysis of the Extra-Trees algorithm is also provided, as well as a geometrical and a kernel characterization of the models induced.
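A minimal Python sketch of the split-randomization step described above, assuming numeric attributes: at each node, K attributes are drawn at random, each receives a single cut-point drawn uniformly between its local minimum and maximum, and the best of these K random candidate splits is kept. The function name pick_random_split and the variance-based score are illustrative choices for the regression case, not the authors' reference implementation.

import numpy as np

def pick_random_split(X, y, K, rng):
    """Illustrative sketch: draw K random (attribute, cut-point) candidates
    and keep the one with the largest variance reduction (regression case)."""
    n_features = X.shape[1]
    attrs = rng.choice(n_features, size=min(K, n_features), replace=False)
    best = None
    for a in attrs:
        lo, hi = X[:, a].min(), X[:, a].max()
        if lo == hi:                       # constant attribute: cannot split
            continue
        cut = rng.uniform(lo, hi)          # cut-point drawn uniformly at random
        left = X[:, a] < cut
        if not left.any() or left.all():   # degenerate split, skip
            continue
        # Weighted within-child variance; lower is better, so negate for a score.
        score = -(left.sum() * np.var(y[left]) + (~left).sum() * np.var(y[~left]))
        if best is None or score > best[0]:
            best = (score, a, cut)
    return best                            # (score, attribute index, cut-point) or None

In practice, scikit-learn's ExtraTreesClassifier and ExtraTreesRegressor (in sklearn.ensemble) implement this algorithm, with max_features playing the role of the attribute-selection parameter and min_samples_split that of the stopping threshold.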

https://hal.archives-ouvertes.fr/hal-00341932
Contributor: Frédéric Davesne
Submitted on: Wednesday, November 26, 2008 - 1:08:53 PM
Last modification on: Tuesday, June 30, 2020 - 11:54:04 AM

Citation

Pierre Geurts, Damien Ernst, Louis Wehenkel. Extremely Randomized Trees. Machine Learning, Springer Verlag, 2006, 63 (1), pp.3-42. ⟨10.1007/s10994-006-6226-1⟩. ⟨hal-00341932⟩
