
STOCHASTIC DISCRETIZATION FOR THE LONG-RUN AVERAGE REWARD IN FLUID MODELS

Published online by Cambridge University Press:  27 February 2003

I.J.B.F. Adan
Affiliation:
Department of Mathematics and Computer Science, Eindhoven University of Technology, Eindhoven, The Netherlands, E-mail: [email protected]
J.A.C. Resing
Affiliation:
Department of Mathematics and Computer Science, Eindhoven University of Technology, Eindhoven, The Netherlands, E-mail: [email protected]
V.G. Kulkarni
Affiliation:
Department of Operations Research, University of North Carolina, Chapel Hill, NC 27599, E-mail: [email protected]

Abstract

Stochastic discretization is a technique of representing a continuous random variable as a random sum of i.i.d. exponential random variables. In this article, we apply this technique to study the limiting behavior of a stochastic fluid model. Specifically, we consider an infinite-capacity fluid buffer, where the net input of fluid is regulated by a finite-state irreducible continuous-time Markov chain. Most long-run performance characteristics for such a fluid system can be expressed as the long-run average reward for a suitably chosen reward structure. In this article, we use stochastic discretization of the fluid content process to efficiently determine the long-run average reward. This method transforms the continuous-state Markov process describing the fluid model into a discrete-state quasi-birth–death process. Hence, standard tools, such as the matrix-geometric approach, become available for the analysis of the fluid buffer. To demonstrate this approach, we analyze the output of a buffer processing fluid from K sources on a first-come first-served basis.
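The core idea can be sketched numerically. Below is a minimal Monte Carlo illustration of representing a continuous nonnegative random variable as a random sum of i.i.d. exponential random variables; the specific choice of drawing the count N conditionally Poisson with mean λx is an illustrative construction for this sketch, not necessarily the exact representation used in the article, and the function and parameter names (stochastic_discretization_sample, lam) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_discretization_sample(x, lam, rng):
    """Represent a continuous nonnegative value x as a random sum of
    i.i.d. Exp(lam) variables: draw N ~ Poisson(lam * x) and return the
    sum of N exponentials.  (Illustrative construction; the article's
    exact representation may differ.)"""
    n = rng.poisson(lam * x)
    return rng.exponential(1.0 / lam, size=n).sum()

# Example: approximate a Uniform(0, 2) fluid level by exponential "quanta".
lam = 50.0                      # discretization rate; larger lam => finer representation
x = rng.uniform(0.0, 2.0, size=100_000)
x_disc = np.array([stochastic_discretization_sample(xi, lam, rng) for xi in x])

print("original mean/var:   ", x.mean(), x.var())
print("discretized mean/var:", x_disc.mean(), x_disc.var())
# The means agree; the extra variance (about 2*E[X]/lam) vanishes as lam grows.
```

Under this construction the conditional mean of the random sum equals x, and its conditional variance is 2x/λ, so the discretized variable reproduces the original distribution increasingly well as the discretization rate λ increases.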

Type: Research Article
Copyright: © 2003 Cambridge University Press
