From Cost-Sensitive Classification to Tight F-measure Bounds - HAL Open Archive
Conference paper, Year: 2019

From Cost-Sensitive Classification to Tight F-measure Bounds

Kevin Bascol
Rémi Emonet
Elisa Fromont
Amaury Habrard
Guillaume Metzler
Marc Sebban

Abstract

The F-measure is a classification performance measure, especially suited to imbalanced datasets, which provides a compromise between the precision and the recall of a classifier. As this measure is non-convex and non-linear, it is often optimized indirectly via cost-sensitive learning (which assigns different costs to false positives and false negatives). In this article, we derive theoretical guarantees that give tight bounds on the best F-measure obtainable from cost-sensitive learning. We also give an original geometric interpretation of the bounds, which serves as an inspiration for CONE, a new algorithm to optimize the F-measure. Using 10 datasets exhibiting varied class imbalance, we illustrate that our bounds are much tighter than those of previous work, and show that CONE learns models with either higher F-measures than existing methods or comparable ones in fewer iterations.
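To make the two quantities in the abstract concrete, here is a minimal sketch (not the paper's CONE algorithm, and the function names are illustrative) of the F-measure computed from confusion-matrix counts, alongside the kind of weighted error that cost-sensitive learning minimizes as a surrogate, with distinct costs on false positives and false negatives:

```python
# Minimal sketch: the F_beta measure from confusion counts, and a simple
# cost-sensitive surrogate loss with separate costs for false positives
# and false negatives. Function names are illustrative, not from the paper.

def f_measure(tp, fp, fn, beta=1.0):
    """F_beta = (1 + beta^2) * TP / ((1 + beta^2) * TP + beta^2 * FN + FP)."""
    denom = (1 + beta**2) * tp + beta**2 * fn + fp
    return (1 + beta**2) * tp / denom if denom else 0.0

def cost_sensitive_loss(fp, fn, c_fp=1.0, c_fn=1.0):
    """Weighted error: cost-sensitive learning optimizes this linear
    surrogate instead of the non-convex, non-linear F-measure."""
    return c_fp * fp + c_fn * fn

# Example: 80 true positives, 10 false positives, 20 false negatives.
print(round(f_measure(80, 10, 20), 3))        # F1 = 160/190
print(cost_sensitive_loss(10, 20, c_fn=2.0))  # penalize FN twice as much
```

The loss is linear in the error counts, which is why it is easy to optimize; the paper's contribution is bounding how close the best F-measure reachable through such surrogates is to the optimum.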
Main file
aistats_2018.pdf (3.77 MB) - Download
Origin: Files produced by the author(s)

Dates and versions

hal-02049763 , version 1 (26-02-2019)

Identifiers

  • HAL Id : hal-02049763 , version 1

Cite

Kevin Bascol, Rémi Emonet, Elisa Fromont, Amaury Habrard, Guillaume Metzler, et al.. From Cost-Sensitive Classification to Tight F-measure Bounds. AISTATS 2019 - 22nd International Conference on Artificial Intelligence and Statistics, Apr 2019, Naha, Okinawa, Japan. pp.1245-1253. ⟨hal-02049763⟩
360 views
205 downloads
