Journal articles

Rescaling Entropy and Divergence Rates

Loïck Lhote 1, Valérie Girardin 2
1 Equipe AMACC - Laboratoire GREYC - UMR6072
GREYC - Groupe de Recherche en Informatique, Image, Automatique et Instrumentation de Caen
Abstract: The usual notion of divergence rate, obtained by rescaling by the number of time units, is extended here by rescaling with a suitable sequence instead, which defines and determines meaningful generalized divergence rates; rescaled entropy rates appear as a special case. A suitable rescaling is naturally induced by the asymptotic behavior of the marginal divergences. Closed-form formulas are obtained as soon as the marginal divergences behave like powers of some analytic functions, a property satisfied by a wide class of countable Markov chains. Most divergence and entropy functionals defined in the literature are covered, e.g., the classical Shannon, Kullback-Leibler, Rényi, and Tsallis functionals. For illustration purposes, the Ferreri and Basu-Harris-Hjort-Jones functionals, among others, are also considered.
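As a quick illustration of why rescaling by the number of time units is the natural choice in the memoryless case (a minimal Python sketch, not taken from the paper): for an i.i.d. source, the Rényi entropy of the n-fold joint distribution grows linearly in n, so dividing by n recovers the marginal Rényi entropy for every n. The paper's contribution concerns processes where this linear growth fails and a different rescaling sequence is required.

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha != 1) of a finite distribution p."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def product_distribution(p, n):
    """Joint distribution of n i.i.d. copies of p, flattened to a vector."""
    joint = np.array([1.0])
    for _ in range(n):
        joint = np.outer(joint, p).ravel()
    return joint

p = np.array([0.2, 0.3, 0.5])
alpha = 2.0
for n in (1, 2, 4):
    h_n = renyi_entropy(product_distribution(p, n), alpha)
    # Rescaling by n (the number of time units) gives the same value for all n,
    # namely the marginal Rényi entropy: the Rényi entropy rate of the i.i.d. source.
    print(n, h_n / n)
```

Since the sum of q^alpha over the joint distribution factorizes as (sum of p^alpha)^n, the printed ratio h_n / n is constant in n, which is exactly the statement that the classical (unrescaled-sequence) entropy rate exists for i.i.d. sources.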



Loïck Lhote, Valérie Girardin. Rescaling Entropy and Divergence Rates. IEEE Transactions on Information Theory, Institute of Electrical and Electronics Engineers, 2015, 61 (11), pp.5868-5882. ⟨10.1109/TIT.2015.2476486⟩. ⟨hal-01196817⟩