Discrete-time stochastic processes

...counting process is equivalent to the Erlang density for the successive arrival epochs. Specifying the probability density of $S_1, S_2, \ldots$ as Erlang specifies the marginal densities of $S_1, S_2, \ldots$, but need not specify the joint densities of these rv's. Figure 2.4 illustrates this in terms of the joint density of $S_1, S_2$, given as

$$
f_{S_1 S_2}(s_1, s_2) = \lambda^2 \exp(-\lambda s_2) \quad \text{for } 0 \le s_1 \le s_2,
$$

and 0 elsewhere. The figure illustrates how the joint density can be changed without changing the marginals.

Figure 2.4: The joint density of $S_1, S_2$ is nonzero in the region $0 \le s_1 \le s_2$. It can be changed, while holding the marginals constant, by reducing the joint density by $\varepsilon$ in the upper-left and lower-right squares of that region and increasing it by $\varepsilon$ in the upper-right and lower-left squares.

There is a similar effect with the Bernoulli process: there exists a discrete counting process for which the number of arrivals from 0 to $t$ is a binomial rv for each integer $t$, but the process is not Bernoulli. This is explored in Exercise 2.5.

The next definition of a Poisson process is based on its incremental properties. Consider the number of arrivals in some very small interval $(t, t+\delta]$. Since $\tilde N(t, t+\delta)$ has the same distribution as $N(\delta)$, we can use (2.15) to get

$$
\begin{aligned}
\Pr\{\tilde N(t, t+\delta) = 0\} &= e^{-\lambda\delta} = 1 - \lambda\delta + o(\delta) \\
\Pr\{\tilde N(t, t+\delta) = 1\} &= \lambda\delta\, e^{-\lambda\delta} = \lambda\delta + o(\delta) \\
\Pr\{\tilde N(t, t+\delta) \ge 2\} &= o(\delta).
\end{aligned}
\qquad (2.17)
$$

Definition 3 of a Poisson process: A Poisson counting process is a counting process that satisfies (2.17) and has the stationary and independent increment properties.

We have seen that Definition 1 implies Definition 3. The essence of the argument in the other direction is that for any interarrival interval $X$, $F_X(x+\delta) - F_X(x)$ is the probability of an arrival in an appropriate infinitesimal interval of width $\delta$, which by (2.17) is $\lambda\delta + o(\delta)$. Turning this into a differential equation (see Exercise 2.7), we get the desired exponential interarrival intervals.

Definition 3 has an intuitive appeal, since it is based on the idea of independent arrivals during arbitrary disjoint intervals. It has the disadvantage that one must do a considerable amount of work to verify that these conditions are mutually consistent; probably the easiest way is to start with Definition 1 and derive these properties. Showing that there is a unique process satisfying the conditions of Definition 3 is even harder, but this is not necessary at this point, since all we need is the use of these properties. Section 2.2.5 will better illustrate how to use this definition (or, more precisely, how to use (2.17)).

What (2.17) accomplishes in Definition 3, beyond the assumption of independent and stationary increments, is the prevention of bulk arrivals. For example, consider a counting process in which arrivals always occur in pairs, and the intervals between successive pairs are IID and exponentially distributed with parameter $\lambda$ (see Figure 2.5). For this process, $\Pr\{\tilde N(t, t+\delta) = 1\} = 0$ and $\Pr\{\tilde N(t, t+\delta) = 2\} = \lambda\delta + o(\delta)$, thus violating (2.17). This process has stationary and independent increments, however, since the process formed by viewing a pair of arrivals as a single incident is a Poisson process.

Figure 2.5: A counting process modeling bulk arrivals. $X_1$ is the time until the first pair of arrivals and $X_2$ is the interval between the first and second pair of arrivals.
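To make the role of (2.17) concrete, here is a minimal simulation sketch (in Python with NumPy; the rate $\lambda = 1$, the interval width $\delta = 0.01$, the trial count, and the function name are illustrative choices, not part of the text). It estimates the probabilities of 0, 1, and 2 or more arrivals in a small interval, both for an ordinary Poisson process and for the paired-arrival process of Figure 2.5, and only the latter violates (2.17).

```python
import numpy as np

rng = np.random.default_rng(seed=0)
lam, delta, trials = 1.0, 0.01, 100_000   # illustrative parameters

def arrivals_in_small_interval(bulk: bool) -> int:
    """Count the arrivals falling in (1, 1 + delta] on one sample path.

    Arrival epochs are separated by IID Exponential(rate=lam) gaps.
    If bulk is True, each epoch carries two simultaneous arrivals,
    as in the paired-arrival process of Figure 2.5.
    """
    t, count = 0.0, 0
    while True:
        t += rng.exponential(1.0 / lam)   # next arrival epoch
        if t > 1.0 + delta:
            return count
        if t > 1.0:
            count += 2 if bulk else 1

for bulk in (False, True):
    counts = np.array([arrivals_in_small_interval(bulk) for _ in range(trials)])
    print(f"bulk={bulk}: "
          f"P(N=0)~{np.mean(counts == 0):.4f}  "
          f"P(N=1)~{np.mean(counts == 1):.4f}  "
          f"P(N>=2)~{np.mean(counts >= 2):.4f}")

# Poisson case (bulk=False): estimates near 1 - lam*delta = 0.99, lam*delta = 0.01,
# and an o(delta) remainder.  Paired case: P(N=1) is exactly 0 while two arrivals
# occur with probability about lam*delta, which is exactly what (2.17) forbids.
```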
2.2.5 The Poisson process as a limit of shrinking Bernoulli processes

The intuition of Definition 3 can be achieved in a much less abstract way by starting with the Bernoulli process, which has the properties of Definition 3 in a discrete-time sense. We then go to an appropriate limit of a sequence of these processes, and find that this sequence of Bernoulli processes converges in various ways to the Poisson process.

Recall that a Bernoulli process is an IID sequence, $Y_1, Y_2, \ldots$, of binary random variables for which $p_Y(1) = q$ and $p_Y(0) = 1 - q$. We can visualize $Y_i = 1$ as an arrival at time $i$ and $Y_i = 0$ as no arrival, but we can also 'shrink' the time scale of the process so that for some integer $j > 0$, $Y_i$ is an arrival or no arrival at time $i 2^{-j}$. We consider a sequence, indexed by $j$, of such shrinking Bernoulli processes, and in order to keep the arrival rate constant, we let $q = \lambda 2^{-j}$ for the $j$th process. Thus for each unit increase in $j$, the Bernoulli process shrinks by replacing each slot with two slots, each with half the previous arrival probability. The expected number of arrivals per time unit is then $\lambda$, matching the Poisson process that we are approximating. If we look at this $j$th process relative to Definition 3 of a Poisson process, we see that for these regularly spaced increments o...
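The limiting claim of this subsection can also be checked numerically. Below is a minimal sketch (plain Python; the rate $\lambda = 1$, the values of $j$, and the helper function names are illustrative assumptions, not from the text). It compares the pmf of the number of arrivals in one time unit of the $j$th shrinking Bernoulli process, which is Binomial($2^j$, $\lambda 2^{-j}$), with the Poisson pmf of mean $\lambda$; the gap shrinks as $j$ grows.

```python
from math import comb, exp, factorial

lam = 1.0   # arrival rate per time unit (illustrative)

def binomial_pmf(k: int, n: int, p: float) -> float:
    return comb(n, k) * p**k * (1.0 - p) ** (n - k)

def poisson_pmf(k: int, mu: float) -> float:
    return exp(-mu) * mu**k / factorial(k)

# In the j-th shrinking Bernoulli process, one time unit contains n = 2**j
# slots, each with arrival probability p = lam * 2**(-j), so the number of
# arrivals in (0, 1] is Binomial(n, p).  As j grows, this pmf approaches
# the Poisson(lam) pmf.
for j in (2, 5, 10, 15):
    n, p = 2**j, lam * 2.0 ** (-j)
    gap = max(abs(binomial_pmf(k, n, p) - poisson_pmf(k, lam)) for k in range(6))
    print(f"j={j:2d}: max_k |Binomial - Poisson| over k < 6 is {gap:.2e}")
```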