
Smooth And Consistent Probabilistic Regression Trees

Abstract: We propose a generalization of regression trees, referred to as Probabilistic Regression (PR) trees, that adapts to the smoothness of the prediction function relating input and output variables while preserving the interpretability of the prediction and remaining robust to noise. In PR trees, an observation is associated with all regions of a tree through a probability distribution that reflects how far the observation is from each region. We show that such trees are consistent, meaning that their error tends to 0 as the sample size tends to infinity, a property that has not been established for similar previous proposals such as Soft trees and Smooth Transition Regression trees. We further explain how PR trees can be used in different ensemble methods, namely Random Forests and Gradient Boosted Trees. Lastly, we assess their performance through extensive experiments that illustrate their benefits in terms of predictive performance, interpretability, and robustness to noise.
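To make the core idea concrete, here is a minimal, hypothetical sketch (not the authors' code) of a "probabilistic" regression tree on 1-D inputs: each leaf is an interval with a mean prediction, an observation gets a weight for every leaf that decays with its distance to that leaf's region, and the prediction is the weighted average of leaf means. The Gaussian-kernel weighting and the `bandwidth` parameter are illustrative assumptions, not the paper's exact construction.

```python
import math

def region_distance(x, lo, hi):
    """Distance from point x to the interval [lo, hi); 0 if x is inside."""
    if x < lo:
        return lo - x
    if x >= hi:
        return x - hi
    return 0.0

def pr_tree_predict(x, leaves, bandwidth=0.5):
    """Soft prediction: average leaf means, weighted by how close x is
    to each leaf's region (Gaussian kernel on the region distance)."""
    weights = [math.exp(-(region_distance(x, lo, hi) / bandwidth) ** 2)
               for lo, hi, _ in leaves]
    total = sum(weights)
    return sum(w * mean for w, (_, _, mean) in zip(weights, leaves)) / total

# Three leaves partitioning [0, 3), with means 1.0, 2.0 and 4.0.
leaves = [(0.0, 1.0, 1.0), (1.0, 2.0, 2.0), (2.0, 3.0, 4.0)]
```

Unlike a standard (hard) regression tree, which would predict exactly the mean of the single leaf containing `x`, the soft prediction varies smoothly across region boundaries, which is the smoothness property the abstract refers to.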
Document type: Conference papers

Contributor: Myriam Tami
Submitted on : Thursday, December 10, 2020 - 9:53:13 AM
Last modification on : Tuesday, January 4, 2022 - 6:31:44 AM
Long-term archiving on: Thursday, March 11, 2021 - 6:50:02 PM


Files produced by the author(s)


HAL Id: hal-03050168, version 1


Sami Alkhoury, Emilie Devijver, Marianne Clausel, Myriam Tami, Éric Gaussier, et al.. Smooth And Consistent Probabilistic Regression Trees. NeurIPS 2020 - 34th International Conference on Neural Information Processing Systems, Dec 2020, Virtual, France. pp.1-11. ⟨hal-03050168⟩