Fast simulation of buffer overflows in queuing systems

1990 
Because buffer overflows are rare events, estimating their statistics in queuing systems via direct simulation is often very expensive in computer time. Past work on fast simulation using importance sampling has concentrated on systems with Poisson arrival processes and exponentially distributed service times. The authors demonstrate how, using large deviations theory and deterministic optimal control, an asymptotically optimal simulation system (in the sense of variance) can be generated for queues with a variety of arrival and service processes. In particular, it is shown how to generate an optimal simulation system for a number of queues with deterministic service times. Such systems are of great practical interest because of their application to the modeling of asynchronous transfer mode switches.
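The core idea the abstract describes can be illustrated on the simplest case it generalizes: for an M/M/1 queue, the classic asymptotically optimal change of measure swaps the arrival and service rates, so that overflow becomes the typical behaviour under the simulation measure and every overflow path carries the same likelihood ratio. The sketch below (a standard textbook construction, not the authors' method for general arrival and service processes) estimates the probability that the queue, starting with one customer, reaches a buffer level B before emptying.

```python
import random

def is_overflow_estimate(lam, mu, B, n_paths, seed=0):
    """Importance-sampling estimate of the probability that an M/M/1
    queue (arrival rate lam < service rate mu), starting with one
    customer, reaches level B before emptying.

    Change of measure: swap lam and mu, so the embedded random walk
    drifts upward.  On any overflow path the numbers of up- and
    down-steps differ by B - 1, so the likelihood ratio collapses to
    the constant (lam/mu)**(B - 1)."""
    rng = random.Random(seed)
    p_up_is = mu / (lam + mu)       # up-step probability under the IS measure
    lr = (lam / mu) ** (B - 1)      # likelihood ratio of every overflow path
    total = 0.0
    for _ in range(n_paths):
        level = 1
        while 0 < level < B:
            level += 1 if rng.random() < p_up_is else -1
        if level == B:
            total += lr             # paths that empty first contribute 0
    return total / n_paths

def exact_overflow(lam, mu, B):
    """Exact value via the gambler's-ruin formula, for comparison."""
    r = mu / lam
    return (r - 1) / (r ** B - 1)
```

With lam = 0.3, mu = 0.7, and B = 20 the target probability is about 6e-8; a naive simulation would need on the order of 10^9 paths to see even a handful of overflows, while the estimator above is accurate to within a few percent with a few thousand paths, because its only variance comes from paths that empty first.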