Journal articles

Generalized divergences from generalized entropies

Abstract: Several quantifiers of information, also known as entropies, have been introduced in different contexts and from different motivations. For almost every one of these entropies, a corresponding measure of the loss (or gain) of information has been introduced. In this work we introduce generalized weighted divergences associated with an arbitrary entropy. The resulting measures are closely related to Bregman divergences. We study the main formal properties of the resulting divergences, extend them to weighted probability distributions, and apply some of them to the analysis of simulated and real time series.
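
For background, and not taken from the article itself, the classical Bregman divergence generated by a strictly convex, differentiable function $F$ is usually written as

$$
D_F(p, q) = F(p) - F(q) - \langle \nabla F(q),\, p - q \rangle ,
$$

which recovers the Kullback-Leibler divergence when $F$ is the negative Shannon entropy, $F(p) = \sum_i p_i \log p_i$. The generalized weighted divergences of the paper are described as closely related to this construction.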

https://hal.archives-ouvertes.fr/hal-01987470
Contributor: Steeve Zozor
Submitted on: Monday, January 21, 2019 - 10:43:03 AM
Last modification on: Wednesday, November 3, 2021 - 5:13:21 AM

Citation

Leonardo Riveaud, Diego Mateos, Steeve Zozor, Pedro Lamberti. Generalized divergences from generalized entropies. Physica A: Statistical Mechanics and its Applications, Elsevier, 2018, 510, pp.68-76. ⟨10.1016/j.physa.2018.06.111⟩. ⟨hal-01987470⟩
