
Goodness-of-fit tests based on (h, ϕ)-divergences and entropy differences

Abstract : We consider fitting non-categorical data to a parametric family of distributions by means of tests based on (h, ϕ)-divergence estimates. The class of (h, ϕ)-divergences, introduced in Salicrú et al. (1993), includes the well-known classes of φ-divergences, Bregman divergences and distortion measures; the most classical examples are the Kullback-Leibler, Rényi and Tsallis divergences. Most (h, ϕ)-divergences are associated with (h, ϕ)-entropy functionals, e.g., the Kullback-Leibler divergence with Shannon entropy. Distributions maximizing (h, ϕ)-entropies under moment constraints arise in numerous applications and are also of theoretical interest. Beyond the family of exponential distributions, which maximize Shannon entropy, see, e.g., Bercher (2014) for an overview of various information inequalities involving the so-called q-Gaussian distributions, i.e., distributions maximizing Rényi (or Tsallis) entropy under variance constraints. For distributions maximizing Shannon or Rényi entropy under moment constraints, the related divergence is well known to reduce to an entropy difference; estimating the divergence then reduces to estimating entropy, see Girardin and Lequesne (2013a, 2013b). A commonly used non-parametric procedure for estimating entropy is the nearest neighbors method; see Vasicek (1976) for Shannon entropy and Leonenko et al. (2008) for Rényi entropy. Vasicek (1976) deduced a test of normality whose statistic involves a Shannon entropy difference, thus opening the way for numerous authors to adapt or extend the procedure into goodness-of-fit tests for various sub-families of exponential distributions. Recently, Girardin and Lequesne (2013b) considered goodness-of-fit tests for q-Gaussian distributions (among which the non-standard Student distribution arises as a meaningful example) based on Rényi divergence and entropy differences. We further show how this methodology extends to families of distributions maximizing other (h, ϕ)-entropies.
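To illustrate the entropy-difference testing idea the abstract builds on, the following is a minimal Python sketch (not the authors' implementation) of Vasicek's (1976) spacing estimator of Shannon entropy and the derived normality statistic exp(H)/s. Since the Gaussian maximizes Shannon entropy under a variance constraint, this statistic is bounded above by sqrt(2πe) ≈ 4.13 and should approach that value under normality; small values lead to rejection. The function names and the default window size m are illustrative assumptions.

```python
import numpy as np


def vasicek_entropy(x, m=None):
    """Vasicek (1976) spacing estimator of Shannon entropy:
    H_mn = (1/n) * sum_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),
    where x_(i) are order statistics, clamped at the sample boundaries."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = int(round(np.sqrt(n)))  # common heuristic window, an assumption here
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]  # x_(i+m), clamped to x_(n)
    lower = x[np.maximum(idx - m, 0)]      # x_(i-m), clamped to x_(1)
    return float(np.mean(np.log(n / (2.0 * m) * (upper - lower))))


def normality_statistic(x, m=None):
    """Vasicek's test statistic K = exp(H_mn) / s, with s the sample
    standard deviation. Under normality K is close to its maximal
    value sqrt(2*pi*e) ~ 4.13; markedly smaller K rejects normality."""
    s = np.std(x, ddof=1)
    return float(np.exp(vasicek_entropy(x, m)) / s)
```

For a large Gaussian sample the statistic sits just below sqrt(2πe) (the estimator has a known negative bias), while clearly non-Gaussian data, e.g. exponential samples, give visibly smaller values.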


Contributor : Jean-François Bercher
Submitted on : Wednesday, November 26, 2014 - 12:40:07 PM
Last modification on : Saturday, January 15, 2022 - 3:57:59 AM
Long-term archiving on : Friday, February 27, 2015 - 11:35:54 AM




  • HAL Id : hal-01087568, version 1


Jean-François Bercher, V Girardin, J Lequesne, Ph Regnault. Goodness-of-fit tests based on (h, ϕ)-divergences and entropy differences. 34th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, A. Djafari and F. Barbaresco, Sep 2014, Amboise, France. ⟨hal-01087568⟩


