
On transience of $\mathrm{M}/\mathrm{G}/\infty$ queues

Published online by Cambridge University Press:  10 October 2024

Serguei Popov*
Affiliation:
University of Porto
* Postal address: Centro de Matemática, University of Porto, Rua do Campo Alegre 687, 4169–007 Porto, Portugal. Email address: [email protected]

Abstract

We consider an $\mathrm{M}/\mathrm{G}/\infty$ queue with infinite expected service time. We then provide the transience/recurrence classification of the states (the system is said to be at state n if there are n customers being served), observing also that here (unlike irreducible Markov chains, for example) it is possible for recurrent and transient states to coexist. We also prove a lower bound on the growth speed in the transient case.

Type
Original Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution, and reproduction in any medium, provided the original work is properly cited.
Copyright
© The Author(s), 2024. Published by Cambridge University Press on behalf of Applied Probability Trust

In this note we consider a classical $\mathrm{M}/\mathrm{G}/\infty$ queue (see e.g. [3]): the customers arrive according to a Poisson process with rate $\lambda$; upon arrival, a customer immediately enters service, and the service times are i.i.d. (non-negative) random variables with some general distribution. For notational convenience, let S be a generic random variable with that distribution. We also assume that at time 0 there are no customers being served. Let $Y_t$ denote the number of customers in the system at time t, which we also refer to as the state of the system at time t; note that, in general, Y is not a Markov process. Nevertheless, let us still call a state m recurrent if the set $\{t\,\colon\, Y_t=m\}$ is a.s. unbounded (i.e. m is ‘visited infinitely many times’) and transient if that set is a.s. bounded (i.e. the system escapes from m eventually). Let us also observe that $\mathbb{P}[S=\infty]>0$ clearly implies that Y goes to infinity (indeed, it is straightforward to obtain that $\liminf_{t\to\infty}(Y_t/t)\geq \lambda\mathbb{P}[S=\infty]$), so from now on we assume that $S<\infty$ a.s.

We are mainly interested in the situation where the system is unstable, i.e. when $\mathbb{E} S = \infty$. In this situation, our (Markovian) intuition tells us that, in principle, the system can be transient (in the sense that $Y_t\to \infty$ a.s.) or recurrent (i.e. all states are visited infinitely many times a.s.). However, it turns out that for this model the complete picture is more complicated.

Theorem 1. Define

(1) \begin{equation} k_0 = \min\biggl\{k\in\mathbb{Z}_+ \,\colon\, \int_0^\infty (\mathbb{E}(S\wedge t))^k \exp({-}\lambda \mathbb{E}(S\wedge t))\, {\mathrm{d}} t = \infty \biggr\}\end{equation}

(with the convention $\min\emptyset=+\infty$). Then

\begin{equation*} \liminf_{t\to\infty}Y_t = k_0 \quad \textit{a.s.}\end{equation*}

In particular, if

(2) \begin{equation} \int_0^\infty (\mathbb{E}(S\wedge t))^k \exp({-}\lambda \mathbb{E}(S\wedge t))\, {\mathrm{d}} t < \infty\quad \textit{for all $k\geq 0$,}\end{equation}

then the system is transient; if

\begin{equation*} \int_0^\infty \exp({-}\lambda \mathbb{E}(S\wedge t))\, {\mathrm{d}} t = \infty,\end{equation*}

then the system is recurrent.
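As a simple illustration (an example of ours, not from the original text): suppose $\mathbb{P}[S>u]=u^{-\alpha}$ for all large u, with $\alpha\in(0,1)$. Then

\[ \mathbb{E}(S\wedge t)=\int_0^t \mathbb{P}[S>v]\, {\mathrm{d}} v \sim \frac{t^{1-\alpha}}{1-\alpha} \quad \text{as } t\to\infty,\]

so the integrand in (2) decays like $t^{k(1-\alpha)}\exp({-}ct^{1-\alpha})$ for some $c>0$, (2) holds for every $k\geq 0$, and the system is transient. On the other hand, if $\mathbb{E} S<\infty$ then $\mathbb{E}(S\wedge t)$ stays bounded, the integral in (1) already diverges for $k=0$, and the system is recurrent.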

Before proving this result, we make the following remark. Let us define M(t) to be the maximal remaining service time among the customers present at time t. This is a so-called extremal shot noise process; see [1] and references therein. It is not difficult to see that transience of $M({\cdot})$ is equivalent to transience of state 0 in $\mathrm{M}/\mathrm{G}/\infty$; Theorem 2.5 of [1] then provides a criterion for the transience of $M({\cdot})$ (and therefore for the transience of state 0 in our situation).

Proof of Theorem 1. We start with a simple observation: for any $j\geq 0$, $\{\liminf Y_t =j\}$ is a tail event, so it has probability 0 or 1. This implies that $\liminf Y_t$ is a.s. a constant (which may be equal to $+\infty$).

We use the following representation of the process (see Figure 1): consider a Poisson process in $\mathbb{R}_+^2$ with intensity measure $\lambda\, {\mathrm{d}} t \times {\mathrm{d}} F_S(u)$, where $F_S(u)=\mathbb{P}[S\leq u]$ is the distribution function of S. A point (t, u) of this Poisson process is interpreted in the following way: a customer arrived at time t and the duration of its service will be u. Now, draw a (dotted) line in the southeast direction from each point, as shown in the figure; as long as this line stays in $\mathbb{R}_+^2$, the corresponding customer is present in the system. If we draw a vertical line upwards from (t, 0), then the number of dotted lines it intersects is equal to $Y_t$.

Figure 1. A Poisson representation of $\mathrm{M}/\mathrm{G}/\infty$ . In this example, there are exactly three customers at time t.
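As an aside (purely illustrative and not part of the argument), this representation is also a direct recipe for simulating the process. Below is a minimal Python sketch based on it; the rate, the horizon, the helper names, and the Pareto-type service distribution (chosen so that $\mathbb{E} S=\infty$) are all assumptions of ours.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

def simulate_arrivals_departures(lam, sample_service, horizon):
    """Poisson representation on [0, horizon]: arrival times form a rate-lam
    Poisson process, and each arrival s gets an independent service time u,
    so that customer is present during [s, s + u)."""
    # Over-sample inter-arrival gaps (enough with overwhelming probability),
    # then discard arrivals beyond the horizon.
    gaps = rng.exponential(1.0 / lam, size=int(2 * lam * horizon) + 100)
    arrivals = np.cumsum(gaps)
    arrivals = arrivals[arrivals <= horizon]
    services = sample_service(len(arrivals))   # i.i.d. copies of S
    return arrivals, arrivals + services

def Y(t, arrivals, departures):
    """State at time t: customers that have arrived but not yet departed."""
    return int(np.sum((arrivals <= t) & (departures > t)))

# Illustrative heavy-tailed service time: P[S > u] = (1 + u)^(-1/2), E S = infinity.
heavy_tailed = lambda n: rng.pareto(0.5, size=n)

arr, dep = simulate_arrivals_departures(lam=1.0, sample_service=heavy_tailed,
                                         horizon=10_000.0)
print([Y(t, arr, dep) for t in (10.0, 100.0, 1_000.0, 10_000.0)])
\end{verbatim}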

Next, for $k\in\mathbb{Z}_+$ let

\[ T_k:= \{t\geq 0\,\colon\, Y_t=k\}\]

denote the set of times when the system has exactly k customers, and let

\[ U_{t} = \bigl\{(s,u)\in\mathbb{R}_+^2\,\colon\, s\in[0,t], u\geq t-s \bigr\}.\]

We note that $Y_t$ equals the number of points in $U_{t}$, which has Poisson distribution with mean

\[ \int_{U_{t}}\lambda \, {\mathrm{d}} s\, {\mathrm{d}} F_S(u) = \lambda \mathbb{E}(S\wedge t).\]
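For the reader's convenience, the last equality is just the tail formula for the expectation of $S\wedge t$:

\[ \int_{U_{t}}\lambda \, {\mathrm{d}} s\, {\mathrm{d}} F_S(u) = \lambda\int_0^t \mathbb{P}[S\geq t-s]\, {\mathrm{d}} s = \lambda\int_0^t \mathbb{P}[S\geq v]\, {\mathrm{d}} v = \lambda \mathbb{E}(S\wedge t),\]

since $\int_0^t \mathbb{P}[S> v]\, {\mathrm{d}} v = \mathbb{E}(S\wedge t)$ and $\mathbb{P}[S\geq v]$ can differ from $\mathbb{P}[S> v]$ only at the (at most countably many) atoms of S.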

Therefore, by Fubini’s theorem, we have (here $|A|$ stands for the Lebesgue measure of $A\subset \mathbb{R}$ )

(3) \begin{equation}\mathbb{E} |T_k| = \mathbb{E} \int_0^{\infty} \mathbf{1}\{Y_t=k\}\, {\mathrm{d}} t= \dfrac{\lambda^k}{k!}\int_0^{\infty} (\mathbb{E}(S\wedge t))^k\exp({-}\lambda\mathbb{E}(S\wedge t))\, {\mathrm{d}} t.\end{equation}

Now, assume that $\mathbb{E}|T_k|<\infty$ for some $k\geq 0$; this automatically implies that $\mathbb{E}|T_\ell|<\infty$ for $0\leq\ell\leq k$. In particular, $|T_0|,\ldots,|T_k|$ are a.s. finite; let us show that $T_0,\ldots,T_k$ then have to be a.s. bounded (a small technical issue that we have to resolve because we are working in continuous time). Probably the cleanest way to see this is as follows. First notice that $T_0$ is a union of intervals of i.i.d. random lengths (with Exp($\lambda$) distribution), because each time the system becomes empty it remains so until the arrival of the next customer. Therefore, $|T_0|<\infty$ clearly means that $\sup T_0\leq K_0$ for some (random) $K_0$. Now, after $K_0$ there are no longer any $1\to 0$ transitions, so the remaining part of $T_1$ again becomes a union of such intervals, and hence it is bounded as well; we then repeat this reasoning a suitable number of times to obtain that $T_k$ must be a.s. bounded. Since, by the definition (1) of $k_0$, we have $\mathbb{E}|T_k|<\infty$ for every $k<k_0$, this implies that $\liminf_{t\to \infty} Y_t \geq k_0$ a.s.

Next, assume that $\{0,\ldots,k\}$ is a transient set, in the sense that $\liminf_{t\to \infty} Y_t \geq k+1$ a.s.; let us show that this implies that $\mathbb{E}|T_k|<\infty$. Indeed, first we can choose a sufficiently large $h>0$ in such a way that

\[ \mathbb{P}[Y_t\geq k+1 \text{ for all }t\geq h]\geq \dfrac{1}{2}.\]

Define a stopping time $\tau=\inf\{t\geq h \,\colon\, Y_t\leq k\}$ (again, with the convention $\inf\emptyset = +\infty$). Then a crucial observation is that what one sees after $\tau$ is a superposition of two independent systems: one formed by the customers (with their remaining service times) present at $\tau$, and the other a copy of the original system. A simple coin-tossing argument, together with the fact that an initially non-empty system (i.e. with some customers being served, under any assumptions on their remaining service times) dominates an initially empty one, then shows that $|T_k|$ (in fact, $|T_0|+\cdots+|T_k|$) is dominated by an $h\times\mathrm{Geom}_0 ( {\tfrac{1}{2}} )$ random variable and therefore has finite expectation. This means that $\liminf_{t\to \infty} Y_t \leq k_0$ a.s. (because otherwise, in the situation when $k_0<\infty$, we would have $\mathbb{E}|T_{k_0}|<\infty$, which, by the definition of $k_0$, is not the case). This concludes the proof of Theorem 1.

Regarding this result, we may observe that in most situations one would have $k_0=0$ or $+\infty$; this is because convergence of such integrals is usually determined by what is in the exponent. Still, it is not difficult to construct ‘strange examples’ with $0<k_0<\infty$, i.e. where the process will visit $\{0,\ldots, k_0-1\}$ only finitely many times, but will hit every $k\geq k_0$ infinitely often a.s. (a behaviour one cannot have with irreducible Markov chains). For instance, let $\lambda=1$ and fix $b>0$; next, consider a service time distribution such that

\[1-F_S(u)=\frac{1}{u}+\frac{b}{u\ln u}\]

for large enough u. Then it is elementary to obtain that $\mathbb{E}(S\wedge t) = \ln t + b \ln\ln t + {\mathrm{O}}(1)$, so the integrals in (1) diverge precisely when $k\geq b-1$, meaning that $k_0 = \lceil b \rceil-1$.
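To spell out the computation (a brief sketch): with $u_0$ a constant such that the above formula for $1-F_S(u)$ holds for all $u\geq u_0$,

\[ \mathbb{E}(S\wedge t) = \int_0^t \mathbb{P}[S>u]\, {\mathrm{d}} u = {\mathrm{O}}(1) + \int_{u_0}^t \biggl(\frac{1}{u}+\frac{b}{u\ln u}\biggr) {\mathrm{d}} u = \ln t + b\ln\ln t + {\mathrm{O}}(1),\]

so that (recall that $\lambda=1$) $(\mathbb{E}(S\wedge t))^k\exp({-}\mathbb{E}(S\wedge t)) = \Theta\bigl((\ln t)^{k-b}t^{-1}\bigr)$; the substitution $v=\ln t$ then shows that the integral in (1) diverges exactly when $k-b\geq -1$, i.e. when $k\geq b-1$.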

Now, in the situation when (2) holds and Y is transient, it may also be useful to say something about the speed of convergence of $Y_t$ to infinity. We do not intend to go deeply into this question here, but only prove a particular result needed for future reference: in [2] we work with a different model which in some sense dominates $\mathrm{M}/\mathrm{G}/\infty$. We will now give a lower bound on the growth of $Y_t$; more specifically, we will show that under certain conditions $Y_t$ is eventually at least a constant fraction of its expected value $\lambda\mathbb{E}(S\wedge t)$. For $q\in(0,1)$, define $\gamma_q = 1 - q - q\ln q^{-1} > 0$.

Theorem 2. Fix $q\in(0,1)$ and assume that

(4) \begin{equation}\int_0^\infty \exp({-}\gamma_q\lambda \mathbb{E}(S\wedge t))\, {\mathrm{d}} t < \infty.\end{equation}

Then

(5) \begin{equation} \mathbb{P}[Y_t \geq q\lambda \mathbb{E}(S\wedge t) \text{ for all large enough } t] = 1.\end{equation}

Proof. Let

\[ H_q = \{t\geq 0\,\colon\, Y_t < q\lambda \mathbb{E}(S\wedge t) \};\]

our goal is to show that $H_q$ is a.s. bounded in the case when (4) holds. We recall a standard (Chernoff) tail bound: if X is Poisson($\mu$) and $q\in(0,1)$, then

(6) \begin{equation} \mathbb{P}[X\leq q\mu]\leq \exp({-}(q\mu\ln q + \mu-q\mu))= \exp({-}\gamma_q \mu).\end{equation}
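For completeness, (6) is the exponential Markov (Chernoff) bound with the optimal tilt: for any $\theta>0$,

\[ \mathbb{P}[X\leq q\mu] = \mathbb{P}\bigl[{\mathrm{e}}^{-\theta X}\geq {\mathrm{e}}^{-\theta q\mu}\bigr] \leq {\mathrm{e}}^{\theta q\mu}\, \mathbb{E}\, {\mathrm{e}}^{-\theta X} = \exp\bigl(\theta q\mu + \mu({\mathrm{e}}^{-\theta}-1)\bigr),\]

and choosing $\theta=\ln q^{-1}$ turns the exponent into $q\mu\ln q^{-1}+\mu(q-1)={-}\gamma_q\mu$.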

Then, analogously to (3), we obtain from (6) that

(7) \begin{equation} \mathbb{E} |H_q| \leq \int_0^\infty \exp({-}\gamma_q\lambda \mathbb{E}(S\wedge t))\, {\mathrm{d}} t;\end{equation}

so, by (4), we have $\mathbb{E}|H_q|<\infty$, meaning that $|H_q|<\infty$ a.s. To see that this implies that $H_q$ is a.s. bounded, one can reason analogously to the proof of Theorem 1. If $t\in H_q$, then $s\in H_q$ for all $s\in (t, A_t)$, where $A_t$ is the first moment after t at which a customer arrives in the system. This implies that the lengths of the intervals that constitute $H_q$ dominate a sequence of i.i.d. random variables with Exp($\lambda$) distribution; in turn, this clearly implies that if $|H_q|$ is finite then it has to be bounded. This concludes the proof of Theorem 2.

Funding information

There are no funding bodies to thank relating to the creation of this article.

Competing interests

There were no competing interests to declare which arose during the preparation or publication process of this article.

References

Foucart, C. and Yuan, L. (2023). Extremal shot noise processes and random cutout sets. To appear in Bernoulli. Available at arXiv:2302.03082.
Menshikov, M., Popov, S. and Wade, A. (2024). Semi-infinite particle systems with exclusion interaction and heterogeneous jump rates. Available at arXiv:2405.05246.
Newell, G. F. (1966). The $\mathrm{M}/\mathrm{G}/\infty$ queue. SIAM J. Appl. Math. 14, 86–88.