THE STRONG LIMIT THEOREM FOR RELATIVE ENTROPY DENSITY RATES BETWEEN TWO ASYMPTOTICALLY CIRCULAR MARKOV CHAINS

Published online by Cambridge University Press: 02 April 2018

Ying Tang
Affiliation:
Faculty of Science, Jiangsu University, Zhenjiang 212013, People's Republic of China and Shanghai Normal University, Tianhua College, Shanghai 201815, China
Weiguo Yang
Affiliation:
Faculty of Science, Jiangsu University, Zhenjiang 212013, People's Republic of China E-mail: [email protected]
Yue Zhang
Affiliation:
Faculty of Science, Jiangsu University, Zhenjiang 212013, People's Republic of China E-mail: [email protected]

Abstract

In this paper, we study the strong limit theorem for the relative entropy density rates between two finite asymptotically circular Markov chains. First, we prove some lemmas on which the main result is based. Then, we establish two strong limit theorems for non-homogeneous Markov chains. Finally, we obtain the main result of this paper. As corollaries, we obtain the strong limit theorem for the relative entropy density rates between two finite non-homogeneous Markov chains. We also prove that the relative entropy density rates between two finite non-homogeneous Markov chains are uniformly integrable under certain conditions.
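To fix the object of study, the following is a minimal sketch of the standard definition of the relative entropy density as it is commonly used in this literature; the notation (P, Q, f_n) is illustrative, and the paper's exact formulation and assumptions may differ.

```latex
% Illustrative only: a standard definition from the relative-entropy
% literature; the paper's precise notation and hypotheses may differ.
% Let (X_n) take values in a finite state space, and let P and Q be two
% probability measures under each of which (X_n) is a (possibly
% non-homogeneous) Markov chain. Writing P(X_0, ..., X_n) for the joint
% probability under P evaluated at the observed sample, the relative
% entropy density is
\[
  f_n \;=\; \frac{1}{n}\,
    \log \frac{P\left(X_0, X_1, \dots, X_n\right)}
              {Q\left(X_0, X_1, \dots, X_n\right)} ,
\]
% and a strong limit theorem identifies the almost-sure behaviour of f_n
% (the relative entropy density rate) as n tends to infinity.
% Informally, a circular Markov chain is one whose transition matrices
% repeat periodically, and an asymptotically circular chain is one whose
% transition matrices approach such a periodically repeating family.
```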

Type: Research Article
Copyright © Cambridge University Press 2018
