Journal article, Expert Systems with Applications, Year: 2023

Cautious weighted random forests

Abstract

The random forest is an efficient and accurate classification model that makes decisions by aggregating a set of trees, either by voting or by averaging class posterior probability estimates. However, tree outputs may be unreliable in the presence of scarce data. The imprecise Dirichlet model (IDM) provides a workaround by replacing point probability estimates with interval-valued ones. This paper investigates a new tree aggregation method based on the theory of belief functions to combine such probability intervals, resulting in a cautious random forest classifier. In particular, we propose a strategy for computing tree weights based on the minimization of a convex cost function that takes both determinacy and accuracy into account and makes it possible to adjust the level of cautiousness of the model. The proposed model is evaluated on 25 UCI datasets; it is shown to adapt better to noise in the training data and to achieve a better compromise between informativeness and cautiousness.
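To illustrate the interval-valued tree outputs mentioned in the abstract, the sketch below (Python with scikit-learn) shows how the IDM maps the class counts observed in a tree leaf to lower and upper probability bounds [n_k / (n + s), (n_k + s) / (n + s)]. The single-tree setup, the Iris data, and the choice s = 2 are illustrative assumptions only; the paper's belief-function aggregation and tree-weight optimization are not reproduced here.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

def idm_intervals(class_counts, s=2.0):
    """Imprecise Dirichlet model: map class counts n_k in a leaf to
    probability bounds [n_k / (n + s), (n_k + s) / (n + s)]."""
    counts = np.asarray(class_counts, dtype=float)
    n = counts.sum()
    return counts / (n + s), (counts + s) / (n + s)

# Fit a single shallow tree (illustrative stand-in for one tree of a forest).
X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# Class counts in the leaf reached by the first training sample.
leaf = tree.apply(X[:1])[0]
counts = np.bincount(y[tree.apply(X) == leaf], minlength=3)

lower, upper = idm_intervals(counts, s=2.0)
print("counts:", counts, "lower:", lower.round(3), "upper:", upper.round(3))
```

The fewer samples a leaf contains, the wider the resulting intervals, which is what makes the forest's combined prediction cautious when data are scarce.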
Main file
Cautious_Weighted_Random_Forests.pdf (3.78 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03895122, version 1 (12-12-2022)

Identifiers

Cite

Haifei Zhang, Benjamin Quost, Marie-Hélène Masson. Cautious weighted random forests. Expert Systems with Applications, 2023, 213 (Part A), pp.118883. ⟨10.1016/j.eswa.2022.118883⟩. ⟨hal-03895122⟩
89 views
55 downloads
