Journal articles

On some entropy functionals derived from Rényi information divergence

Abstract: We consider the maximum entropy problems associated with the Rényi $Q$-entropy, subject to two kinds of constraints on expected values: a constraint on the standard expectation, and a constraint on the generalized expectation as encountered in nonextensive statistics. The optimum maximum entropy probability distributions, which can exhibit a power-law behaviour, are derived and characterized. The Rényi entropy of the optimum distributions can be viewed as a function of the constraint. This defines two families of entropy functionals in the space of possible expected values. General properties of these functionals, including nonnegativity, minimum, and convexity, are documented. Their relationships as well as numerical aspects are also discussed. Finally, we work out some specific cases for the reference measure $Q(x)$ and recover some well-known entropies in a limit case.
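The constrained maximization described in the abstract can be sketched numerically. The example below is illustrative and not taken from the paper: for Rényi order q = 2 with a uniform reference measure on a finite support, maximizing H_2(p) = -log Σ p_i² subject to normalization and a standard mean constraint reduces to minimizing Σ p_i², whose interior stationary point is affine in x (a short Lagrange-multiplier computation). The support `xs`, target mean `m`, and the competing distribution are all assumed choices for the demonstration.

```python
import math

def maxent_renyi2(xs, m):
    """Illustrative sketch (not the paper's derivation): the H_2-maximal
    distribution with mean m on support xs, assuming the affine solution
    of the Lagrangian stationarity conditions stays nonnegative."""
    n = len(xs)
    s1 = sum(xs)
    s2 = sum(x * x for x in xs)
    # Solve the 2x2 linear system coming from the two constraints:
    #   n*a  + s1*b = 1   (normalization)
    #   s1*a + s2*b = m   (standard mean constraint)
    det = n * s2 - s1 * s1
    a = (s2 - s1 * m) / det
    b = (n * m - s1) / det
    p = [a + b * x for x in xs]
    assert all(pi >= 0.0 for pi in p), "affine solution left the simplex"
    return p

def renyi2(p):
    """Rényi entropy of order 2: H_2(p) = -log sum_i p_i^2."""
    return -math.log(sum(pi * pi for pi in p))

xs = list(range(10))   # support 0..9 (uniform mean would be 4.5)
m = 3.0                # target mean pushed below the uniform mean
p = maxent_renyi2(xs, m)

# Any other distribution with the same mean has strictly lower H_2,
# e.g. all mass split between the endpoints 0 and 9:
q_alt = [2.0 / 3.0] + [0.0] * 8 + [1.0 / 3.0]

print(round(renyi2(p), 4), round(renyi2(q_alt), 4))
```

For orders q ≠ 2 the same constrained problem no longer has an affine solution; the optimizers take the power-law (q-exponential) form characterized in the paper, which this quadratic special case only hints at.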

Cited literature: 36 references.

Contributor: Jean-François Bercher
Submitted on: Thursday, May 1, 2008 - 7:37:11 PM
Last modification on: Saturday, January 15, 2022 - 3:58:48 AM
Long-term archiving on: Friday, May 28, 2010 - 6:09:58 PM


Jean-François Bercher. On some entropy functionals derived from Rényi information divergence. Information Sciences, Elsevier, 2008, 178 (12), pp.2489-2506. ⟨10.1016/j.ins.2008.02.003⟩. ⟨hal-00276749⟩


