Goodness-of-fit tests based on (h, ϕ)-divergences and entropy differences

Abstract: We consider fitting non-categorical data to a parametric family of distributions by means of tests based on (h, ϕ)-divergence estimates. The class of (h, ϕ)-divergences, introduced in Salicrú et al. (1993), includes the well-known classes of φ-divergences, Bregman divergences and distortion measures. The most classic examples are the Kullback-Leibler, Rényi and Tsallis divergences. Most (h, ϕ)-divergences are associated with (h, ϕ)-entropy functionals, e.g., the Kullback-Leibler divergence with Shannon entropy. Distributions maximizing (h, ϕ)-entropies under moment constraints arise in numerous applications and are also of theoretical interest. Besides the family of exponential distributions maximizing Shannon entropy, see, e.g., Bercher (2014) for an overview of various information inequalities involving the so-called q-Gaussian distributions, i.e., distributions maximizing Rényi (or Tsallis) entropy under variance constraints. For distributions maximizing Shannon or Rényi entropy under moment constraints, the related divergence is well known to reduce to an entropy difference. Estimating the divergence then reduces to estimating the entropy; see Girardin and Lequesne (2013a, 2013b). A commonly used non-parametric procedure for estimating entropy is the nearest neighbors method; see Vasicek (1976) for Shannon entropy and Leonenko et al. (2008) for Rényi entropy. Vasicek (1976) derived a test of normality whose statistic involves a Shannon entropy difference, opening the way for numerous authors to adapt or extend the procedure to obtain goodness-of-fit tests for various sub-families of exponential distributions. Recently, Girardin and Lequesne (2013b) considered goodness-of-fit tests for q-Gaussian distributions (among which the non-standard Student distribution arises as a meaningful example) based on Rényi divergence and entropy differences. We further show how this methodology may extend to families of distributions maximizing other (h, ϕ)-entropies.
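To illustrate the non-parametric entropy estimation step the abstract refers to, the following is a minimal sketch of the spacing estimator of Shannon entropy in the spirit of Vasicek (1976). The function name and the choice of window m are illustrative assumptions, not taken from the paper; for a normal sample the estimate should approach the true entropy 0.5·log(2πe) ≈ 1.4189.

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Spacing estimator of Shannon differential entropy (Vasicek, 1976).

    H_hat = (1/n) * sum_i log( n/(2m) * (x_(i+m) - x_(i-m)) ),
    where x_(i) are order statistics, clamped at the sample boundaries
    (x_(i) = x_(1) for i < 1 and x_(i) = x_(n) for i > n).
    """
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        # Heuristic window; any m -> infinity with m/n -> 0 gives consistency.
        m = max(1, int(round(np.sqrt(n))))
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]   # x_(i+m), clamped at x_(n)
    lower = x[np.maximum(idx - m, 0)]       # x_(i-m), clamped at x_(1)
    return float(np.mean(np.log(n / (2.0 * m) * (upper - lower))))

# Example: estimate the entropy of a standard normal sample.
rng = np.random.default_rng(0)
sample = rng.standard_normal(2000)
est = vasicek_entropy(sample)
print(est)  # near 1.4189, with a mild downward bias at finite n
```

A goodness-of-fit statistic in this spirit then compares such an estimate with the maximal entropy attainable under the fitted moment constraints: under the null hypothesis the entropy difference should be close to zero.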
Document type:
Conference paper
A. Djafari and F. Barbaresco. 34th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Sep 2014, Amboise, France. 2014, 〈https://www.see.asso.fr/node/9587〉

Cited literature [2 references]

https://hal.archives-ouvertes.fr/hal-01087568
Contributor: Jean-François Bercher
Submitted on: Wednesday, November 26, 2014 - 12:40:07
Last modified on: Tuesday, October 3, 2017 - 15:10:02
Long-term archiving on: Friday, February 27, 2015 - 11:35:54

File

Regnault_AbstractMaxEnt2014v2....
Files produced by the author(s)

Identifiers

  • HAL Id : hal-01087568, version 1

Citation

Jean-François Bercher, V Girardin, J Lequesne, Ph Regnault. Goodness-of-fit tests based on (h, ϕ)-divergences and entropy differences. A. Djafari and F. Barbaresco. 34th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, Sep 2014, Amboise, France. 2014, 〈https://www.see.asso.fr/node/9587〉. 〈hal-01087568〉


Metrics

Record views: 302
File downloads: 110