Journal articles

Rescaling Entropy and Divergence Rates

Loïck Lhote ¹, Valérie Girardin ²
¹ Equipe AMACC - Laboratoire GREYC - UMR6072
GREYC - Groupe de Recherche en Informatique, Image et Instrumentation de Caen
Abstract: Based on rescaling by some suitable sequence instead of the number of time units, the usual notion of divergence rate is here extended to define and determine meaningful generalized divergence rates. Rescaling entropy rates appears as a special case. Suitable rescaling is naturally induced by the asymptotic behavior of the marginal divergences. Closed-form formulas are obtained as soon as the marginal divergences behave like powers of some analytical functions. A wide class of countable Markov chains is proven to satisfy this property. Most divergence and entropy functionals defined in the literature are covered, e.g., the classical Shannon, Kullback-Leibler, Rényi, and Tsallis functionals. For illustration purposes, the Ferreri and Basu-Harris-Hjort-Jones divergences, among others, are also considered.
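For the classical (unrescaled) case the abstract alludes to, the Shannon entropy rate of a finite ergodic Markov chain is the joint entropy divided by the number of time units n, in the limit; for the Rényi entropy, the marginal entropies grow like the n-th power of the Perron eigenvalue of an entrywise power of the transition matrix, which is the kind of "power of an analytical function" behavior the paper exploits. The sketch below is a hedged illustration of these two well-known finite-state formulas, not the paper's general construction (function names and the example matrix are ours):

```python
import numpy as np

def stationary_distribution(P):
    """Left Perron eigenvector of P, normalized to a probability vector."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    return pi / pi.sum()

def shannon_entropy_rate(P):
    """H = -sum_i pi_i sum_j P_ij log P_ij (nats), i.e. rescaling by n."""
    pi = stationary_distribution(P)
    # Replace zero entries by 1 before the log; they contribute 0 to P*log P.
    logP = np.log(np.where(P > 0, P, 1.0))
    return -np.sum(pi[:, None] * P * logP)

def renyi_entropy_rate(P, alpha):
    """Rényi rate: log of the spectral radius of the matrix (P_ij^alpha),
    divided by (1 - alpha); the marginal Rényi entropies grow linearly
    at this slope because sum p^alpha behaves like lambda^n."""
    lam = np.max(np.abs(np.linalg.eigvals(P ** alpha)))
    return np.log(lam) / (1.0 - alpha)

# Toy 2-state chain (illustrative values only).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
print(shannon_entropy_rate(P))        # entropy rate in nats
print(renyi_entropy_rate(P, 0.5))     # Rényi rate, alpha = 0.5
```

As alpha tends to 1, the Rényi rate recovers the Shannon rate, which is one way to check the two formulas against each other numerically.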

Submitted on: Thursday, September 10, 2015 - 2:41:22 PM
Last modification on: Tuesday, October 19, 2021 - 11:34:56 PM



Loïck Lhote, Valérie Girardin. Rescaling Entropy and Divergence Rates. IEEE Transactions on Information Theory, Institute of Electrical and Electronics Engineers, 2015, 61 (11), pp.5868-5882. ⟨10.1109/TIT.2015.2476486⟩. ⟨hal-01196817⟩
