I. A. Ahmad and P. E. Lin, A nonparametric estimation of the entropy for absolutely continuous distributions, IEEE Trans. Inform. Theory, vol.36, pp.688-692, 1989.

C. Andrieu and J. Thoms, A tutorial on adaptive MCMC, Stat. Comput, vol.18, pp.343-373, 2008.

Y. Atchadé, Approximate spectral gaps for Markov chain mixing times in high dimension, 2019.

Y. Bai, R. V. Craiu, and A. F. Di Narzo, Divide and conquer: A mixture-based approach to regional adaptation for MCMC, J. Comp. Graph. Stat, pp.1-17, 2010.

Y. Bai, G. O. Roberts, and J. S. Rosenthal, On the containment condition for adaptive Markov chain Monte Carlo algorithms, 2008.

J. Beirlant, E. J. Dudewicz, L. Györfi, and E. C. van der Meulen, Nonparametric entropy estimation: an overview, Int. J. Math. Stat. Sci, vol.6, pp.17-39, 1997.

T. B. Berrett, R. J. Samworth, and M. Yuan, Efficient multivariate entropy estimation via k-nearest neighbour distances, Ann. Statist, vol.47, pp.288-318, 2019.

A. Bulinski and D. Dimitrov, Statistical estimation of the Shannon entropy, Acta Math. Sinica, vol.35, pp.17-46, 2019.

A. Charzyńska and A. Gambin, Improvement of the k-NN entropy estimator with applications in systems biology, Entropy, vol.18, pp.1-19, 2015.

D. Chauveau and H. Alrachid, EntropyMCMC: An R Package for MCMC Simulation and Convergence Evaluation using Entropy and Kullback Divergence Estimation, 2019.

D. Chauveau and P. Vandekerkhove, Smoothness of Metropolis-Hastings algorithm and application to entropy estimation, ESAIM: Probability and Statistics, vol.17, pp.419-431, 2013.
URL : https://hal.archives-ouvertes.fr/hal-00803167

D. Chauveau and P. Vandekerkhove, Simulation based nearest neighbor entropy estimation for (adaptive) MCMC evaluation, JSM Proceedings, Statistical Computing Section, pp.2816-2827, 2014.
URL : https://hal.archives-ouvertes.fr/hal-00879399

R. Douc, A. Guillin, J.-M. Marin, and C. P. Robert, Convergence of adaptive mixtures of importance sampling schemes, Ann. Statist, vol.35, issue.1, pp.420-448, 2007.
URL : https://hal.archives-ouvertes.fr/hal-00432955

B. Efron and C. Morris, Data analysis using Stein's estimator and its generalizations, J. Amer. Stat. Assoc, vol.70, issue.350, pp.311-319, 1975.

P. P. Eggermont and V. N. LaRiccia, Best asymptotic normality of the kernel density entropy estimator for smooth densities, IEEE Trans. Inform. Theory, vol.45, issue.4, pp.1321-1326, 1999.

G. Fort, E. Moulines, P. Priouret, and P. Vandekerkhove, A central limit theorem for adaptive and interacting Markov chains, Bernoulli, vol.20, pp.457-485, 2014.
URL : https://hal.archives-ouvertes.fr/hal-00608569

S. Geman and D. Geman, Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images, IEEE Trans. Pattern Anal. Mach. Intell, vol.6, pp.721-741, 1984.

L. Györfi and E. C. van der Meulen, An entropy estimate based on a kernel density estimation, Limit Theorems in Probability and Statistics (Pécs), vol.57, pp.229-240, 1989.

H. Haario, E. Saksman, and J. Tamminen, An adaptive Metropolis algorithm, Bernoulli, vol.7, issue.2, pp.223-242, 2001.

P. Harremoës and K. K. Holst, Convergence of Markov chains in information divergence, 2007.

W. K. Hastings, Monte Carlo sampling methods using Markov chains and their applications, Biometrika, vol.57, pp.97-109, 1970.

L. Holden, Geometric convergence of the Metropolis-Hastings simulation algorithm, Statistics and Probability Letters, vol.39, 1998.

L. Kozachenko and N. N. Leonenko, Sample estimate of entropy of a random vector, Problems of Information Transmission, vol.23, pp.95-101, 1987.

F. Maire and P. Vandekerkhove, On Markov chain Monte Carlo for sparse and filamentary distributions, 2018.

N. Metropolis, A. Rosenbluth, M. Rosenbluth, A. Teller, and E. Teller, Equation of state calculations by fast computing machines, J. Chem. Phys, vol.21, pp.1087-1092, 1953.

R Core Team, R: A Language and Environment for Statistical Computing, R Foundation for Statistical Computing, 2018.

G. Roberts and J. Rosenthal, Optimal scaling for various Metropolis-Hastings algorithms, Statistical Science, vol.16, pp.351-367, 2001.

G. O. Roberts and J. S. Rosenthal, Examples of Adaptive MCMC, Journal of Computational and Graphical Statistics, vol.18, issue.2, pp.349-367, 2009.

J. Rosenthal, Analysis of the Gibbs sampler for a model related to James-Stein estimators, Stat. and Comp, vol.6, pp.269-275, 1996.

J. S. Rosenthal and G. O. Roberts, Coupling and ergodicity of adaptive MCMC, J. Appl. Prob, vol.44, pp.458-475, 2007.

H. Singh, N. Misra, V. Hnizdo, A. Fedorowicz, and E. Demchuk, Nearest neighbor estimate of entropy, American Journal of Mathematical and Management Sciences, vol.23, issue.3, pp.301-321, 2003.

K. Sricharan, D. Wei, and A. O. Hero III, Ensemble estimators for multivariate entropy estimation, 2013.

D. Stowell and M. D. Plumbley, Fast multidimensional entropy estimation by k-d partitioning, IEEE Signal Processing Letters, vol.16, issue.6, pp.537-540, 2009.

M. Thompson, SamplerCompare: A framework for comparing the performance of MCMC samplers, 2010.

J. A. Vrugt, C. J. F. ter Braak, C. Diks, B. A. Robinson, J. M. Hyman et al., Accelerating Markov chain Monte Carlo simulation by differential evolution with self-adaptive randomized subspace sampling, International Journal of Nonlinear Sciences & Numerical Simulation, vol.10, issue.3, pp.271-288, 2009.

Q. Wang, S. R. Kulkarni, and S. Verdú, A nearest-neighbor approach to estimating divergence between continuous random vectors, 2006.

Q. Wang, S. R. Kulkarni, and S. Verdú, Divergence estimation for multidimensional densities via k-nearest-neighbor distances, IEEE Transactions on Information Theory, vol.55, issue.5, pp.2392-2405, 2009.

J. Yang and J. S. Rosenthal, Complexity results for MCMC derived from quantitative bounds, 2019.