Goodness-of-fit tests based on (h, ϕ)-divergences and entropy differences

Abstract: We consider fitting non-categorical data to a parametric family of distributions by means of tests based on (h, ϕ)-divergence estimates. The class of (h, ϕ)-divergences, introduced in Salicrú et al. (1993), includes the well-known classes of ϕ-divergences, Bregman divergences and distortion measures; the most classical examples are the Kullback-Leibler, Rényi and Tsallis divergences. Most (h, ϕ)-divergences are associated with (h, ϕ)-entropy functionals, e.g., the Kullback-Leibler divergence with Shannon entropy. Distributions maximizing (h, ϕ)-entropies under moment constraints arise in numerous applications and are also of theoretical interest. Besides the family of exponential distributions, which maximize Shannon entropy, see, e.g., Bercher (2014) for an overview of information inequalities involving the so-called q-Gaussian distributions, i.e., distributions maximizing Rényi (or Tsallis) entropy under variance constraints. For distributions maximizing Shannon or Rényi entropy under moment constraints, the related divergence is well known to reduce to an entropy difference, so estimating the divergence reduces to estimating the entropy; see Girardin and Lequesne (2013a, 2013b). A commonly used non-parametric procedure for estimating entropy is the nearest neighbors method; see Vasicek (1976) for Shannon entropy and Leonenko et al. (2008) for Rényi entropy. Vasicek (1976) derived a test of normality whose statistic involves a Shannon entropy difference, thus opening the way for numerous authors who adapted or extended the procedure to obtain goodness-of-fit tests for various sub-families of exponential distributions. Recently, Girardin and Lequesne (2013b) considered goodness-of-fit tests for q-Gaussian distributions (among which the non-standard Student distribution arises as a meaningful example) based on Rényi divergence and entropy differences. We further show how this methodology extends to families of distributions maximizing other (h, ϕ)-entropies.
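
To illustrate the entropy-difference principle behind Vasicek's normality test mentioned above, here is a minimal sketch in Python (not taken from the paper): Shannon entropy is estimated with Vasicek's spacing estimator and compared with the maximal entropy of a normal distribution having the sample variance. The window size m, the helper names and the sample data are illustrative choices, not values from the paper.

import numpy as np

def vasicek_entropy(x, m):
    """Vasicek's spacing-based estimate of the Shannon (differential) entropy of x."""
    n = len(x)
    xs = np.sort(x)
    # Clamp the order statistics: X_(i) = X_(1) for i < 1 and X_(i) = X_(n) for i > n.
    padded = np.concatenate([np.repeat(xs[0], m), xs, np.repeat(xs[-1], m)])
    spacings = padded[2 * m:] - padded[:-2 * m]   # X_(i+m) - X_(i-m), assumes no ties
    return np.mean(np.log(n / (2.0 * m) * spacings))

def normality_statistic(x, m):
    """Entropy difference: maximal Shannon entropy of a normal law with the sample
    variance minus the Vasicek estimate. Large values speak against normality."""
    s2 = np.var(x, ddof=1)
    h_max = 0.5 * np.log(2.0 * np.pi * np.e * s2)  # Shannon entropy of N(mu, s2)
    return h_max - vasicek_entropy(x, m)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gaussian = rng.normal(size=200)
    heavy_tailed = rng.standard_t(df=3, size=200)
    print(normality_statistic(gaussian, m=5))      # small (up to estimator bias) under normality
    print(normality_statistic(heavy_tailed, m=5))  # noticeably larger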
https://hal.archives-ouvertes.fr/hal-01087568
Contributor: Jean-François Bercher
Submitted on: Wednesday, November 26, 2014 - 12:40:07 PM
Last modification on: Thursday, February 7, 2019 - 5:32:34 PM
Document(s) archived on: Friday, February 27, 2015 - 11:35:54 AM

File

Regnault_AbstractMaxEnt2014v2....
Files produced by the author(s)

Identifiers

  • HAL Id: hal-01087568, version 1

Citation

Jean-François Bercher, V. Girardin, J. Lequesne, Ph. Regnault. Goodness-of-fit tests based on (h, ϕ)-divergences and entropy differences. A. Djafari and F. Barbaresco (eds.), 34th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Sep 2014, Amboise, France. 2014, https://www.see.asso.fr/node/9587. hal-01087568.
