
Markov population replacement processes

Published online by Cambridge University Press:  01 July 2016

Ioannis I. Gerontidis*
Affiliation: University of Thessaloniki
* Postal address: Mathematics Department, University of Thessaloniki, 54006 Thessaloniki, Greece.

Abstract

We consider a migration process whose singleton process is a time-dependent Markov replacement process. For the singleton process, which may be treated as either open or closed, we study the limiting distribution, the distribution of the time to replacement and related quantities. For a replacement process in equilibrium we obtain a version of Little's law and give conditions for reversibility. For the resulting linear population process we characterize exponential ergodicity under two types of environmental behaviour, convergent and cyclic, and finally, for large population sizes, we provide a diffusion approximation analysis.
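As a rough orientation only, and not in the paper's own notation, the kind of object described above can be sketched as follows: a time-dependent Markov chain on a finite state space with intensity matrix Q(t), whose state-probability vector p(t) satisfies the Kolmogorov forward equation, together with the familiar equilibrium form of Little's law relating mean population size, replacement rate and mean time to replacement. The symbols p(t), Q(t), L, λ and W are illustrative assumptions introduced here, not taken from the article:

\[
\frac{d}{dt}\,p(t) \;=\; p(t)\,Q(t), \qquad p(0)=p_0,
\]
\[
L \;=\; \lambda\,W,
\]

where p(t) is the row vector of state probabilities, Q(t) is the (time-dependent) intensity matrix, L is the mean number in the system in equilibrium, λ is the replacement (throughput) rate and W is the mean time to replacement.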

Type
General Applied Probability
Copyright
Copyright © Applied Probability Trust 1995 

