P. Amblard and C. Vignat, A note on bounded entropies, Physica A: Statistical Mechanics and its Applications, vol.365, issue.1, pp.50-56, 2006.
DOI : 10.1016/j.physa.2006.01.002

URL : https://hal.archives-ouvertes.fr/hal-00136333

P. Billingsley, Statistical Inference for Markov Processes, The University of Chicago Press, 1961.

J. Bourdon, M. E. Nebel, and B. Vallée, On the Stack-Size of General Tries, RAIRO - Theoretical Informatics and Applications, vol.35, issue.2, pp.163-185, 2001.
DOI : 10.1051/ita:2001114

URL : https://hal.archives-ouvertes.fr/hal-00442422

D. Chauveau and P. Vandekerkhove, A Monte Carlo Estimation of the Entropy for Markov Chains, Methodology and Computing in Applied Probability, vol.23, issue.1, pp.133-149, 2007.
DOI : 10.1007/s11009-006-9010-6

URL : https://hal.archives-ouvertes.fr/hal-00140942

F. Chazal and V. Maume-Deschamps, Statistical properties of general Markov dynamical sources: applications to information theory, Discrete Mathematics and Theoretical Computer Science, vol.6, issue.2, pp.283-314, 2004.

G. Ciuperca and V. Girardin, Estimation of the Entropy Rate of a Countable Markov Chain, Communications in Statistics - Theory and Methods, vol.27, issue.14, pp.2543-2557, 2007.
DOI : 10.1137/1118080

T. M. Cover and J. A. Thomas, Elements of Information Theory, Wiley Series in Telecommunications, 1991.

S. Furuichi, Information theoretical properties of Tsallis entropies, J. Math. Physics, vol.47, 2006.

V. Girardin, On the Different Extensions of the Ergodic Theorem of Information Theory, in R. Baeza-Yates, J. Glaz et al. (eds.), Recent Advances in Applied Probability, pp.163-179, 2005.
DOI : 10.1007/0-387-23394-6_7

V. Girardin and A. Sesboüé, Comparative Construction of Plug-in Estimators of the Entropy Rate of Two-state Markov Chains, Methodology and Computing in Applied Probability, vol.23, issue.14, pp.181-200, 2009.
DOI : 10.1007/s11009-008-9106-2

L. Golshani, E. Pasha, and G. Yari, Some properties of Rényi entropy and Rényi entropy rate, Information Sciences, vol.179, issue.14, pp.2426-2433, 2009.
DOI : 10.1016/j.ins.2009.03.002

P. Harremoës, Interpretations of Rényi Entropies and Divergences, Physica A, vol.365, pp.57-62, 2006.

T. Kato, Perturbation Theory for Linear Operators, 1976.
DOI : 10.1007/978-3-662-12678-3

M. L. Menéndez, D. Morales, L. Pardo, and M. Salicrú, (h, φ)-entropy differential metric, Applications of Mathematics, vol.42, issue.2, pp.81-98, 1997.
DOI : 10.1023/A:1022214326758

B. L. S. Prakasa Rao, Maximum Likelihood Estimation for Markov Processes, Ann. Inst. Stat. Math., vol.24, pp.333-345, 1972.

Z. Rached, Rényi's Entropy for Discrete Markov Sources, Master of Science Project, 1998.

Z. Rached, F. Alajaji, and L. L. Campbell, Rényi's Entropy Rate for Discrete Markov Sources, Proc. CISS, pp.613-618, 1999.

A. Rényi, On measures of information and entropy, Proc. 4th Berkeley Symposium on Mathematics, Statistics and Probability, pp.547-561, 1960.

M. Salicrú, M. L. Menéndez, D. Morales, and L. Pardo, Asymptotic distribution of (h, φ)-entropies, Communications in Statistics - Theory and Methods, vol.10, issue.7, pp.2015-2031, 1993.
DOI : 10.1080/03610928508828905

C. Shannon, A Mathematical Theory of Communication, Bell System Technical Journal, vol.27, issue.3, pp.379-423, 1948.
DOI : 10.1002/j.1538-7305.1948.tb01338.x

B. D. Sharma and P. Mittal, New non-additive measures of relative information, J. Comb. Inform. and Syst. Sci, vol.2, pp.122-133, 1975.

C. Tsallis, Possible generalization of Boltzmann-Gibbs statistics, Journal of Statistical Physics, vol.8, issue.1-2, pp.479-487, 1988.
DOI : 10.1007/BF01016429

D. Vere-Jones, Ergodic properties of nonnegative matrices, I and II, Pacific J. Math., vol.22, issue.3, pp.361-386, 1967, and vol.26, pp.601-620, 1968.

M. P. Wachowiak, R. Smolikova, G. D. Tourassi, and A. S. Elmaghraby, Estimation of generalized entropies with sample spacing, Pattern Analysis and Applications, vol.8, pp.95-101, 2005.