On some entropy functionals derived from Rényi information divergence

Abstract: We consider the maximum entropy problems associated with Rényi $Q$-entropy, subject to two kinds of constraints on expected values. The constraints considered are a constraint on the standard expectation and a constraint on the generalized expectation as encountered in nonextensive statistics. The optimum maximum entropy probability distributions, which can exhibit a power-law behaviour, are derived and characterized. The Rényi entropy of the optimum distributions can be viewed as a function of the constraint. This defines two families of entropy functionals in the space of possible expected values. General properties of these functionals, including nonnegativity, minimum, and convexity, are documented. Their relationships as well as numerical aspects are also discussed. Finally, we work out some specific cases for the reference measure $Q(x)$ and recover, in a limiting case, some well-known entropies.
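
For orientation, here is a minimal sketch using standard definitions; the paper's exact notation and normalization may differ. The Rényi information divergence of order $q$ between a density $P$ and a reference measure $Q$ is
\[
  D_q(P \,\|\, Q) \;=\; \frac{1}{q-1}\,\log \int P(x)^{q}\, Q(x)^{1-q}\, \mathrm{d}x ,
\]
and the two constrained problems described above amount to extremizing this quantity (equivalently, the associated Rényi $Q$-entropy) subject to either a standard expectation constraint,
\[
  \int x\, P(x)\, \mathrm{d}x \;=\; m ,
\]
or a generalized (escort) expectation constraint of the kind used in nonextensive statistics,
\[
  \frac{\int x\, P(x)^{q}\, \mathrm{d}x}{\int P(x)^{q}\, \mathrm{d}x} \;=\; m .
\]
The optimum value, viewed as a function of the constraint $m$, gives the entropy functionals studied in the paper.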

https://hal.archives-ouvertes.fr/hal-00276749
Contributor: Jean-François Bercher
Submitted on: Thursday, May 1, 2008 - 7:37:11 PM
Last modification on: Wednesday, April 11, 2018 - 12:12:02 PM
Document(s) archived on: Friday, May 28, 2010 - 6:09:58 PM

Files

main.pdf
Files produced by the author(s)

Identifiers

HAL Id: hal-00276749
DOI: 10.1016/j.ins.2008.02.003

Citation

Jean-François Bercher. On some entropy functionals derived from Rényi information divergence. Information Sciences, Elsevier, 2008, 178 (12), pp.2489-2506. ⟨10.1016/j.ins.2008.02.003⟩. ⟨hal-00276749⟩
