
Simulating Average Delay–Variance Reduction by Conditioning

Published online by Cambridge University Press:  27 July 2009

Sheldon M. Ross
Affiliation:
Department of Industrial Engineering and Operations Research, University of California, Berkeley, California 94720

Abstract

An improved simulation estimator of the expected total delay of the first n customers in the systems GI/M/k and GI/G/1 is obtained by replacing the (raw data) delay of customer i by its expected delay given the state of the system upon that customer's arrival. It is improved in the sense that the sum of the first n such values has the same mean as, and smaller variance than, the usual (raw) estimator. The only constraint on the arrival process is that it be independent of the process of service times.
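
The conditioning idea described above can be illustrated with a short simulation. The sketch below is not the paper's implementation: it assumes the M/M/1 special case (Poisson(lam) arrivals, exponential(mu) service), where memorylessness gives E[D_i | N_i = j] = j/mu for a customer who finds j others in the system on arrival, so the conditioned estimator simply replaces each observed delay by N_i/mu. The function name simulate_delay_estimators and the parameters lam, mu, and seed are illustrative.

```python
import random


def simulate_delay_estimators(n, lam=1.0, mu=1.25, seed=None):
    """Return (raw_total, conditioned_total) delay estimates for the first n
    customers of a FIFO single-server queue with Poisson(lam) arrivals and
    exponential(mu) service -- an M/M/1 special case used only to illustrate
    the conditioning idea."""
    rng = random.Random(seed)
    t = 0.0            # arrival time of the current customer
    delay = 0.0        # queueing delay of the current customer
    service = 0.0      # service time of the previous customer
    departures = []    # departure times of customers not yet gone by time t
    raw_total = 0.0
    cond_total = 0.0
    for i in range(n):
        if i > 0:
            inter = rng.expovariate(lam)
            t += inter
            # Lindley recursion: D_i = max(D_{i-1} + S_{i-1} - A_i, 0)
            delay = max(delay + service - inter, 0.0)
        service = rng.expovariate(mu)
        # customers still present when customer i arrives
        departures = [d for d in departures if d > t]
        n_seen = len(departures)
        departures.append(t + delay + service)
        raw_total += delay            # raw estimator: observed delay
        cond_total += n_seen / mu     # conditioned estimator: E[D_i | N_i] = N_i / mu
    return raw_total, cond_total
```

Averaging both returned totals over many independent replications should show equal means but a smaller sample variance for the conditioned sum, which is the sense in which the estimator is improved.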

Type: Articles
Copyright: © Cambridge University Press 1988

