Conference paper — Year: 2014

Goodness-of-fit tests based on (h, ϕ)-divergences and entropy differences

Abstract

We consider fitting non-categorical data to a parametric family of distributions by means of tests based on (h, ϕ)-divergence estimates. The class of (h, ϕ)-divergences, introduced in Salicrú et al. (1993), includes the well-known classes of φ-divergences, Bregman divergences and distortion measures; the most classic examples are the Kullback-Leibler, Rényi and Tsallis divergences. Most (h, ϕ)-divergences are associated with (h, ϕ)-entropy functionals, e.g., the Kullback-Leibler divergence with Shannon entropy. Distributions maximizing (h, ϕ)-entropies under moment constraints appear in numerous applications and are also of theoretical interest. Besides the family of exponential distributions, which maximize Shannon entropy, see, e.g., Bercher (2014) for an overview of various information inequalities involving the so-called q-Gaussian distributions, i.e., the distributions maximizing Rényi (or Tsallis) entropy under variance constraints. For distributions maximizing Shannon or Rényi entropy under moment constraints, the related divergence is well known to reduce to an entropy difference, so that estimating the divergence reduces to estimating the entropy; see Girardin and Lequesne (2013a, 2013b). A commonly used non-parametric procedure for estimating entropy is the nearest neighbors method; see Vasicek (1976) for Shannon entropy and Leonenko et al. (2008) for Rényi entropy. Vasicek (1976) deduced a test of normality whose statistic involves a Shannon entropy difference, thus opening the way for numerous authors who adapted or extended the procedure to obtain goodness-of-fit tests for various sub-families of exponential distributions. Recently, Girardin and Lequesne (2013b) considered goodness-of-fit tests for q-Gaussian distributions (among which the non-standard Student distribution arises as a meaningful example) based on Rényi divergence and entropy differences. We show how this methodology extends to families of distributions maximizing other (h, ϕ)-entropies.
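As a concrete illustration of the entropy-based testing idea the abstract refers to, the sketch below implements Vasicek's (1976) spacing estimator of Shannon entropy and the resulting normality statistic exp(H)/s, which under normality approaches √(2πe) ≈ 4.13. The function names and the window heuristic m ≈ √n/2 are our choices for illustration, not taken from the paper:

```python
import numpy as np

def vasicek_entropy(x, m=None):
    """Vasicek (1976) spacing estimator of Shannon entropy:
    H = (1/n) * sum_i log( n/(2m) * (X_(i+m) - X_(i-m)) ),
    with order-statistic indices clamped to [1, n] at the boundaries."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    if m is None:
        m = max(1, int(round(np.sqrt(n) / 2)))  # a common heuristic window (assumption)
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]  # X_(i+m), clamped at X_(n)
    lower = x[np.maximum(idx - m, 0)]      # X_(i-m), clamped at X_(1)
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))

def normality_statistic(x, m=None):
    """Vasicek's test statistic exp(H_mn) / s; small values reject normality.
    Under H0 it tends to sqrt(2*pi*e) ~ 4.1327 as n grows."""
    s = np.std(x, ddof=0)  # maximum-likelihood standard deviation
    return np.exp(vasicek_entropy(x, m)) / s
```

For a standard normal sample, the entropy estimate should be close to 0.5·log(2πe) ≈ 1.419 (with a small negative bias), and the statistic close to, but below, 4.13; the entropy-difference tests discussed in the abstract compare such an estimate with the maximum entropy attainable under the fitted moment constraints.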
Main file: Regnault_AbstractMaxEnt2014v2.pdf (49.75 KB)
Source: Files produced by the author(s)

Dates and versions

hal-01087568 , version 1 (26-11-2014)

Identifiers

  • HAL Id : hal-01087568 , version 1

Cite

Jean-François Bercher, V. Girardin, J. Lequesne, Ph. Regnault. Goodness-of-fit tests based on (h, ϕ)-divergences and entropy differences. 34th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, A. Djafari and F. Barbaresco, Sep 2014, Amboise, France. ⟨hal-01087568⟩
373 views
181 downloads
