
Sufficient conditions for regularity, recurrence and ergodicity of Markov processes

Published online by Cambridge University Press:  24 October 2008

Richard L. Tweedie
Affiliation: The Australian National University, Canberra

Abstract

Two sets of conditions are found on the Q-matrix of an irreducible Markov process on a countably infinite state space which ensure that Q is regular; the first set also implies that the Markov process is ergodic, the second that it is recurrent. If the process is not irreducible, the conditions still imply regularity, together with either non-dissipativity or ‘ultimate recurrence’ (reducible analogues of ergodicity and recurrence, respectively). Conditions sufficient for ergodicity or recurrence of the process are also given in both the regular and non-regular cases. The results parallel (and use) results for discrete-time Markov chains, and the known discrete-time recurrence condition is extended to the reducible case. The conditions are illustrated by a competition process example.
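The conditions in question are of Foster–Lyapunov drift type. As a hedged illustration only (not the paper's exact hypotheses), the following sketch checks a drift condition of the form sum_j q_ij V(j) ≤ −ε for all states i outside a finite set, using a hypothetical birth–death Q-matrix (an M/M/1-style queue) and the test function V(i) = i:

```python
# Illustrative sketch, assuming a birth-death process with birth rate lam and
# death rate mu; the Q-matrix rows and test function V are chosen here for
# demonstration and are not taken from the paper.

def qmatrix_row(i, lam, mu):
    """Nonzero entries {j: q_ij} of row i of the birth-death Q-matrix."""
    row = {i + 1: lam}          # birth: i -> i+1 at rate lam
    if i > 0:
        row[i - 1] = mu         # death: i -> i-1 at rate mu
    row[i] = -sum(v for j, v in row.items() if j != i)  # diagonal = -row sum
    return row

def drift(i, lam, mu, V=lambda n: n):
    """Compute sum_j q_ij * V(j) for the test function V."""
    return sum(q * V(j) for j, q in qmatrix_row(i, lam, mu).items())

lam, mu = 1.0, 2.0  # arrival rate below service rate
# For every i >= 1 the drift equals lam - mu = -1, so a negative-drift
# condition holds outside the finite set {0}, the kind of criterion that
# suggests ergodicity in this setting.
print([drift(i, lam, mu) for i in range(1, 5)])
```

The same skeleton adapts to a competition process by replacing `qmatrix_row` with the appropriate two-dimensional transition rates.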

Type
Research Article
Copyright
Copyright © Cambridge Philosophical Society 1975

