
Reversibility of Markov chains with applications to storage models

Published online by Cambridge University Press:  14 July 2016

Hideo Ōsawa*
Affiliation: University of Electro-Communications, Tokyo
* Postal address: Department of Mathematical Statistics, The University of Electro-Communications, 1–5–1 Chofugaoka, Chofu-shi, Tokyo 182, Japan.

Abstract

This paper studies reversibility conditions for stationary Markov chains (discrete-time Markov processes) with a general state space. In particular, we investigate Markov chains having atomic points in the state space. Such processes arise frequently in storage models, for example the waiting time in a queue, the reserve in an insurance risk process, and the content of a dam. Necessary and sufficient conditions for the reversibility of these processes are obtained. Further, we apply these conditions to some storage models and present results for single-server queues and a finite insurance risk model.
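The paper treats general state spaces with atoms, but the underlying notion of reversibility can be illustrated in the simplest finite-state setting via the detailed-balance condition π(x)P(x, y) = π(y)P(y, x). The following is a minimal sketch of that check, not the paper's construction: the transition matrix below is a hypothetical birth–death example, chosen because birth–death chains are always reversible.

```python
import numpy as np

# Hypothetical finite birth-death chain (tridiagonal transition matrix);
# such chains are reversible with respect to their stationary distribution.
P = np.array([
    [0.5, 0.5, 0.0],
    [0.3, 0.4, 0.3],
    [0.0, 0.6, 0.4],
])

# Stationary distribution pi: left eigenvector of P for eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
pi = pi / pi.sum()

# Detailed balance: the matrix F with F[i, j] = pi[i] * P[i, j]
# must be symmetric, i.e. pi[i] P[i, j] = pi[j] P[j, i] for all i, j.
F = pi[:, None] * P
reversible = np.allclose(F, F.T)

print("stationary distribution:", pi)
print("reversible:", reversible)  # True for this birth-death example
```

For the storage models in the paper (waiting times, risk reserves, dam contents) the state space is continuous with an atom at the boundary, so the discrete sum above is replaced by the corresponding balance of measures; the finite sketch is only meant to convey the symmetry being tested.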

Type
Research Papers
Copyright
Copyright © Applied Probability Trust 1985 

