
The Asymptotic Equipartition Property for a Nonhomogeneous Markov Information Source

Published online by Cambridge University Press:  27 July 2009

Weiguo Yang
Affiliation:
Hebei Mining and Civil Engineering Institute, Handan 056038, China

Abstract

In this paper, we study the asymptotic equipartition property (AEP) for a nonhomogeneous Markov information source. We first establish a limit theorem for the averages of functions of two variables of this information source, using the convergence theorem for martingale difference sequences. As corollaries, we obtain several limit theorems, including a limit theorem for the relative entropy density, that hold for any nonhomogeneous Markov information source. We then derive a class of strong laws of large numbers for nonhomogeneous Markov information sources. Finally, we prove the AEP for a class of nonhomogeneous Markov information sources.
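The AEP states that the relative entropy density f_n(ω) = −(1/n) log p(X_1, …, X_n) converges (here, almost surely) to the entropy rate of the source. A minimal numerical sketch of this convergence, assuming a hypothetical two-state nonhomogeneous chain whose transition matrices P_n converge to a fixed matrix P (the specific matrices and initial distribution below are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Limiting transition matrix P and a perturbing matrix Q (illustrative choices)
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
Q = np.array([[0.5, 0.5],
              [0.5, 0.5]])

def transition(n):
    """Time-varying transition matrix P_n -> P as n -> infinity."""
    t = 1.0 / (n + 1)
    return t * Q + (1 - t) * P

def relative_entropy_density(length):
    """Simulate one path and return f_n = -(1/n) log p(x_1, ..., x_n)."""
    mu = np.array([0.5, 0.5])          # initial distribution (assumed)
    x = rng.choice(2, p=mu)
    logp = np.log(mu[x])
    for n in range(1, length):
        Pn = transition(n)
        nxt = rng.choice(2, p=Pn[x])
        logp += np.log(Pn[x, nxt])
        x = nxt
    return -logp / length

f_n = relative_entropy_density(100_000)
print(f_n)  # close to the entropy rate of the limiting chain P (about 0.64 nats)
```

The stationary distribution of the limiting matrix P is (4/7, 3/7), giving an entropy rate of roughly 0.64 nats per symbol; the simulated f_n settles near this value, which is the behavior the AEP describes.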

Type: Research Article

Copyright © Cambridge University Press 1998

