1.1 Elements of Queueing Theory

Queueing theory plays a key role in the modeling and analysis of telecommunication networks. Consider the buffer shown in figure 1. Data packets arrive and are buffered, ready to be read out on a transmission link at a rate of C bits/sec, as indicated. In more general cases, the packets could be considered as different jobs or customers. A work-conserving queue is one in which packets, once admitted to the buffer, must be transmitted or served, and in which the transmission link, the server in figure 1, is never idle so long as there is at least one packet waiting for transmission. Clearly, packets will have to be buffered or queued for service if the number arriving at the input to the buffer in some interval of time is larger than the number the link can transmit in that time. The buffer will thus display alternating intervals when the queue is nonempty or the server is busy, and when the server is idle, with no packets either being transmitted or waiting to be transmitted. Arrivals are generally random or stochastic, so the number of packets in the queue is stochastic as a function of time as well.

Figure 1: Model of the buffering process. Arriving packets enter the buffer and depart on a link of rate C bits/sec; the average arrival rate is λ packets/sec and the average transmission time is 1/μ sec/packet.

Queueing theory enables us to determine the statistics of the queue, from which such desired performance parameters as the time spent waiting in the queue, or the probability that a packet is blocked or lost on arrival, may be found. These statistics in turn depend generally on three quantities:

1. The packet arrival process: the specific arrival statistics of the incoming packets.
2. The packet length distribution: comparable to the customer service time distribution when discussing customer arrivals in the queueing literature.
3. The number of servers and the service discipline: examples of service disciplines include FIFO (first-in, first-out) and LIFO (last-in, first-out).

A more common queueing representation, and the one we use, appears in figure 2. The focus in this figure is on the average arrival rate or load λ and the capacity μ. The circle at the output of the queue represents the queue service. A little thought will indicate that the ratio of load to capacity, λ/μ, should play a critical role in the study of queues.
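To make the alternation of busy and idle periods, and the effect of the load, concrete, the following is a minimal simulation sketch of the single-server, work-conserving FIFO buffer of figure 1. It assumes Poisson arrivals and exponentially distributed transmission times purely for illustration (the discussion above requires only that arrivals be random), and the function name and parameter values are illustrative rather than taken from the text.

    import random

    def simulate_fifo_buffer(lam, mu, num_packets=100_000, seed=1):
        # lam: average arrival rate (packets/sec); mu: service rate (packets/sec),
        # i.e. the link transmits an average packet in 1/mu sec.
        # Poisson arrivals and exponential service times are assumptions made
        # only so the sketch is concrete.
        rng = random.Random(seed)
        arrival = 0.0          # arrival time of the current packet
        server_free_at = 0.0   # time at which the server clears its backlog
        total_wait = 0.0       # accumulated time spent waiting in the queue
        busy_time = 0.0        # accumulated transmission (service) time

        for _ in range(num_packets):
            arrival += rng.expovariate(lam)       # next random arrival
            service = rng.expovariate(mu)         # transmission time of this packet
            start = max(arrival, server_free_at)  # wait only if the server is busy
            total_wait += start - arrival
            busy_time += service
            server_free_at = start + service

        return {
            "mean_wait_sec": total_wait / num_packets,
            "fraction_busy": busy_time / server_free_at,
        }

    if __name__ == "__main__":
        # Hold the capacity fixed at mu = 1000 packets/sec and raise the load:
        # both the waiting time and the fraction of time the server is busy grow.
        for lam in (500, 800, 950):
            print(lam, simulate_fifo_buffer(lam, mu=1000))

Running the sketch with increasing λ against a fixed μ shows the busy fraction and the mean waiting time both rising as the load approaches the capacity, which is exactly the behavior the next paragraph describes.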
Clearly, as this ratio increases (that is, as the load increases relative to the capacity), the queue should build up more and the busy intervals should occur more often, while the probability of queueing delay and the probability of loss both increase. The ratio of load to capacity is called the link utilization and is generally given the special label ρ. We thus have

ρ = λ/μ

We shall see that for very large buffers, approximated by infinite queues, a condition for stable operation of the queue is ρ < 1. For realistic finite queues the queue length is found to increase rapidly in size, with congestion setting in, as ρ approaches 1.
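As a small worked example of these definitions (the numbers below are hypothetical, chosen only for illustration): given the link rate C in bits/sec and an average packet length in bits, the service rate μ in packets/sec follows directly, and ρ = λ/μ tells us whether the stability condition is met.

    # Hypothetical numbers, chosen only to illustrate the definitions above.
    C = 10_000_000            # link transmission rate, bits/sec (10 Mb/s)
    avg_packet_bits = 8000    # average packet length, bits (1000 bytes)
    lam = 1100.0              # average arrival rate, packets/sec

    mu = C / avg_packet_bits  # service rate, packets/sec; 1/mu sec per packet
    rho = lam / mu            # link utilization

    print(f"mu = {mu:.0f} packets/sec, rho = {rho:.2f}")
    if rho < 1:
        print("rho < 1: a very large (infinite) queue would operate stably.")
    else:
        print("rho >= 1: the queue builds up without bound.")

Here μ = 1250 packets/sec and ρ = 0.88, so the stability condition ρ < 1 holds; raising λ toward 1250 packets/sec drives ρ toward 1 and the queue toward congestion.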