Beyond First-Order Uncertainty Estimation with Evidential Models for Open-World Recognition - Archive ouverte HAL
Conference paper Year: 2021

Beyond First-Order Uncertainty Estimation with Evidential Models for Open-World Recognition

Abstract

In this paper, we tackle the challenge of jointly quantifying in-distribution and out-of-distribution (OOD) uncertainties. We introduce KLoS, a KL-divergence measure defined on the class-probability simplex. By leveraging the second-order uncertainty representation provided by evidential models, KLoS captures more than existing first-order uncertainty measures such as predictive entropy. We design an auxiliary neural network, KLoSNet, to learn a refined measure directly aligned with the evidential training objective. Experiments show that KLoSNet acts as a class-wise density estimator and outperforms current uncertainty measures in the realistic setting where no OOD data is available during training. We also report comparisons in the presence of OOD training samples, which shed new light on the impact of the proximity of this data to OOD test data.
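To make the distinction between first-order and second-order measures concrete, below is a minimal illustrative sketch in Python. It assumes an evidential classifier that outputs Dirichlet concentration parameters (one positive value per class), and contrasts predictive entropy (a first-order measure) with a KL-divergence score on the simplex in the spirit of KLoS; the class-prototype Dirichlet and the "peak" parameter used here are assumptions made for illustration, not the paper's exact KLoSNet construction.

    # Illustrative sketch only; the prototype Dirichlet is an assumption, not the authors' exact KLoS.
    import numpy as np
    from scipy.special import gammaln, digamma

    def dirichlet_kl(alpha, beta):
        """Closed-form KL( Dir(alpha) || Dir(beta) )."""
        a0, b0 = alpha.sum(), beta.sum()
        return (gammaln(a0) - gammaln(alpha).sum()
                - gammaln(b0) + gammaln(beta).sum()
                + np.sum((alpha - beta) * (digamma(alpha) - digamma(a0))))

    def predictive_entropy(alpha):
        """First-order measure: entropy of the expected class probabilities."""
        p = alpha / alpha.sum()
        return -np.sum(p * np.log(p + 1e-12))

    def second_order_kl_score(alpha, peak=10.0):
        """Second-order measure: KL on the simplex between the predicted Dirichlet
        and a prototype Dirichlet concentrated on the predicted class (hypothetical prototype)."""
        beta = np.ones_like(alpha)
        beta[np.argmax(alpha)] = peak
        return dirichlet_kl(alpha, beta)

    # Example: a confident, high-evidence prediction vs. a flat, low-evidence one.
    alpha_id = np.array([12.0, 1.0, 1.0])    # strong evidence for class 0
    alpha_ood = np.array([1.1, 1.0, 1.05])   # little evidence for any class
    print(predictive_entropy(alpha_id), predictive_entropy(alpha_ood))
    print(second_order_kl_score(alpha_id), second_order_kl_score(alpha_ood))

The point of the sketch is that the low-evidence input yields a large KL score even though its expected probabilities alone may look only moderately uncertain, which is the kind of information a first-order measure cannot separate.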
Main file
UDL2021-paper-062.pdf (1.8 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03347628, version 1 (17-09-2021)

Identifiers

  • HAL Id: hal-03347628, version 1

Cite

Charles Corbière, Marc Lafon, Nicolas Thome, Matthieu Cord, Patrick Pérez. Beyond First-Order Uncertainty Estimation with Evidential Models for Open-World Recognition. ICML 2021 Workshop on Uncertainty and Robustness in Deep Learning, Sep 2021, Virtual, Austria. ⟨hal-03347628⟩
162 Views
133 Downloads
