
Markov Chains in Many Dimensions

Published online by Cambridge University Press: 01 July 2016

Dimitris N. Politis
Affiliation: Purdue University
Postal address: Department of Statistics, Purdue University, W. Lafayette, IN 47907, USA.

Abstract

A generalization of the notion of a stationary Markov chain in more than one dimension is proposed, and is found to be a special class of homogeneous Markov random fields. Stationary Markov chains in many dimensions are shown to possess a maximum entropy property, analogous to the corresponding property for Markov chains in one dimension. In addition, a representation of Markov chains in many dimensions is provided, together with a method for their generation that converges to their stationary distribution.
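The paper's own generation scheme is not reproduced on this page. As an illustration only, the following sketch shows the standard device the abstract alludes to: single-site (Gibbs) resampling of a homogeneous Markov random field on a two-dimensional lattice, whose iterates converge in distribution to the field's stationary Gibbs distribution. The Ising-type binary field, the inverse-temperature parameter `beta`, and the function name `gibbs_sample_mrf` are illustrative assumptions, not constructions from the paper.

```python
import math
import random

def gibbs_sample_mrf(n, beta, sweeps, seed=0):
    """Gibbs sampler for an n x n binary (Ising-type) Markov random field.

    Illustrative sketch: each site takes values in {-1, +1}, and repeated
    single-site conditional updates converge to the field's stationary
    (Gibbs) distribution. Free boundary conditions are used.
    """
    rng = random.Random(seed)
    # Start from a uniformly random configuration.
    grid = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(n)]
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                # Sum of the (up to four) nearest neighbours.
                s = sum(grid[a][b]
                        for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                        if 0 <= a < n and 0 <= b < n)
                # Conditional probability that the site equals +1,
                # given its neighbourhood (the Markov property).
                p = 1.0 / (1.0 + math.exp(-2.0 * beta * s))
                grid[i][j] = 1 if rng.random() < p else -1
    return grid

sample = gibbs_sample_mrf(n=8, beta=0.4, sweeps=200)
```

Because each update conditions only on a finite neighbourhood, the sampler exercises exactly the local-conditional structure that characterises Markov random fields; the number of sweeps controls how close the output is to stationarity.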

Type: General Applied Probability
Copyright © Applied Probability Trust 1994

