Estimation of entropies and divergences via nearest neighbors
Abstract
We extend the results of Leonenko et al. and show how k-th nearest-neighbor distances in a sample of N i.i.d. vectors with probability density f can be used to consistently estimate the Rényi and Tsallis entropies of the unknown f under minimal assumptions. The method extends to the estimation of statistical distances between two distributions when one i.i.d. sample from each is available.
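To make the idea concrete, here is a minimal sketch of a nearest-neighbor estimator of the Rényi entropy of order q ≠ 1 in the spirit described above. The function name `renyi_entropy_knn`, the choice of k, and the use of SciPy's KD-tree are illustrative assumptions, not the paper's own implementation; the bias-correcting constant involving Γ(k)/Γ(k+1−q) follows the standard Leonenko–Pronzato–Savani form.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gamma

def renyi_entropy_knn(x, q, k=5):
    """Sketch of a k-NN estimator of the Renyi entropy of order q (q != 1).

    x : (N, d) array of i.i.d. samples from an unknown density f.
    Uses the distance from each point to its k-th nearest neighbor
    among the remaining N - 1 points.
    """
    x = np.asarray(x, dtype=float)
    if x.ndim == 1:
        x = x[:, None]
    n, d = x.shape
    tree = cKDTree(x)
    # query k+1 neighbors: column 0 is the point itself (distance 0)
    rho = tree.query(x, k=k + 1)[0][:, k]
    v_d = np.pi ** (d / 2) / gamma(d / 2 + 1)  # volume of the unit d-ball
    # bias-correcting constant; requires k + 1 - q > 0
    c_k = (gamma(k) / gamma(k + 1 - q)) ** (1.0 / (1 - q))
    zeta = (n - 1) * c_k * v_d * rho ** d
    i_hat = np.mean(zeta ** (1 - q))  # consistent estimate of \int f^q
    return np.log(i_hat) / (1 - q)
```

For a standard normal in one dimension, the Rényi entropy of order q is (1/2)log(2π) − log(q)/(2(1−q)), so the estimate for q = 2 should be close to about 1.266 for a moderately large sample.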
Origin: Explicit agreement for this repository