The split processes form independent Poisson processes. This ability to view independent Poisson processes
either independently or as a splitting of a combined process is a powerful technique for
finding almost trivial solutions to many problems.
It was next shown that a nonhomogeneous Poisson process could be viewed as a (homogeneous) Poisson process on a nonlinear time scale. This allows all the properties of (homogeneous) Poisson processes to be applied directly to the nonhomogeneous case. The simplest and most useful result from this is (2.31), showing that the number of arrivals in any interval has a Poisson PMF. This result was used to show that the number of customers in service at any given time τ in an M/G/1 queue has a Poisson PMF with a mean approaching λ times the expected service time in the limit as τ → ∞.
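The time-change idea in this paragraph can be sketched numerically: run a homogeneous rate-1 Poisson process on the transformed scale (0, m(T)] and map each arrival epoch back through the inverse of m. The rate function λ(t) = 2t, with m(t) = t² and inverse √s, is an arbitrary choice for illustration, not an example from the text:

```python
import math
import random

def nonhom_poisson_arrivals(m_inv, m_T, rng):
    """One realization of a nonhomogeneous Poisson process on (0, T]:
    generate a homogeneous rate-1 process on (0, m(T)] and map each
    arrival epoch s back through the inverse time scale m_inv."""
    arrivals, s = [], 0.0
    while True:
        s += rng.expovariate(1.0)   # rate-1 interarrival on the transformed scale
        if s > m_T:
            return arrivals
        arrivals.append(m_inv(s))   # epoch on the original time scale

# Illustrative rate lam(t) = 2t, so m(t) = t^2, m(1) = 1, and m_inv(s) = sqrt(s).
rng = random.Random(1)
counts = [len(nonhom_poisson_arrivals(math.sqrt, 1.0, rng)) for _ in range(20000)]
mean_n = sum(counts) / len(counts)  # N(1) should be Poisson with mean m(1) = 1
```

The empirical mean of N(1) and the empirical probability of zero arrivals should be close to 1 and e⁻¹ respectively, consistent with the number of arrivals in any interval having a Poisson PMF.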
Finally we looked at the distribution of arrivals conditional on n arrivals in the interval (0, t]. It was found that these arrivals had the same joint distribution as the order statistics of n uniform IID rv's in (0, t]. By using symmetry and going back and forth between the uniform variables and the Poisson process arrivals, we found the distribution of the interarrival times, the arrival epochs, and various conditional distributions.

2.7 Exercises

Exercise 2.1. a) Find the Erlang density fSn(t) by convolving fX(x) = λ exp(−λx) with
itself n times.
b) Find the moment generating function of X (or find the Laplace transform of fX(x)), and use this to find the moment generating function (or Laplace transform) of Sn = X1 + X2 + · · · + Xn. Invert your result to find fSn(t).
c) Find the Erlang density by starting with (2.14) and then calculating the marginal density
for Sn .
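As a quick numerical cross-check of parts a) and b), one can convolve a discretized exponential density with itself and compare the result against the closed-form Erlang density λⁿ tⁿ⁻¹ e^(−λt)/(n−1)!. The rate λ = 2, n = 3, and the grid below are arbitrary choices, not values from the text:

```python
import math

lam, n = 2.0, 3      # rate and number of summed exponentials (arbitrary choices)
h, N = 0.01, 600     # grid spacing and number of grid points (t up to 6)

# density of a single exponential, sampled on the grid
f = [lam * math.exp(-lam * (k * h)) for k in range(N)]

# convolve n copies numerically: g_i = sum_j f_j * g_{i-j} * h
g = f[:]
for _ in range(n - 1):
    g = [h * sum(f[j] * g[i - j] for j in range(i + 1)) for i in range(N)]

def erlang(t):
    """Closed-form Erlang density f_Sn(t) = lam^n t^(n-1) e^(-lam t) / (n-1)!"""
    return lam**n * t**(n - 1) * math.exp(-lam * t) / math.factorial(n - 1)

t = 1.0
approx, exact = g[int(t / h)], erlang(t)   # the two should agree closely
```

The discretization introduces a small error of order h, so agreement is approximate rather than exact.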
Exercise 2.2. a) Find the mean, variance, and moment generating function of N (t), as
given by (2.15).
b) Show by discrete convolution that the sum of two independent Poisson rv’s is again
Poisson.
c) Show by using the properties of the Poisson process that the sum of two independent
Poisson rv’s must be Poisson.
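The discrete convolution of part b) can also be checked numerically: convolving the PMFs of two Poisson rv's term by term should reproduce the Poisson PMF with the summed mean. The means 1.5 and 2.5 below are arbitrary choices for illustration:

```python
import math

def poisson_pmf(mean, k):
    """Poisson PMF: e^(-mean) * mean^k / k!"""
    return math.exp(-mean) * mean**k / math.factorial(k)

a, b, K = 1.5, 2.5, 30   # arbitrary means; K is the truncation point

# discrete convolution: Pr{N1 + N2 = k} = sum_j Pr{N1 = j} Pr{N2 = k - j}
conv = [sum(poisson_pmf(a, j) * poisson_pmf(b, k - j) for j in range(k + 1))
        for k in range(K)]

# direct Poisson PMF with mean a + b; should match conv term by term
direct = [poisson_pmf(a + b, k) for k in range(K)]
```

The identity is exact (it follows from the binomial theorem applied to (a + b)^k), so the two lists agree up to floating-point roundoff.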
Exercise 2.3. The purpose of this exercise is to give an alternate derivation of the Poisson distribution for N(t), the number of arrivals in a Poisson process up to time t; let λ be the rate of the process.
a) Find the conditional probability Pr{N(t) = n | Sn = τ} for all τ ≤ t.
b) Using the Erlang density for Sn, use (a) to find Pr{N(t) = n}.

Exercise 2.4. Assume that a counting process {N(t); t ≥ 0} has the independent and stationary increment properties and satisfies (2.15) (for all t > 0). Let X1 be the epoch of the first arrival and Xn be the interarrival time between the (n − 1)st and the nth arrival.
a) Show that Pr{X1 > x} = e^(−λx).
b) Let Sn−1 be the epoch of the (n − 1)st arrival. Show that Pr{Xn > x | Sn−1 = τ} = e^(−λx).
c) For each n > 1, show that Pr{Xn > x} = e^(−λx) and that Xn is independent of Sn−1.
d) Argue that Xn is independent of X1, X2, . . . , Xn−1.
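Part a) rests on the observation that {X1 > x} is the same event as {N(x) = 0}, whose probability under (2.15) is e^(−λx). A small numerical sanity check uses the shrinking-Bernoulli view of a Poisson process; the rate λ = 2 and threshold x = 0.7 are arbitrary choices:

```python
import math

lam, x = 2.0, 0.7   # arbitrary rate and threshold for the check

# Pr{X1 > x} = Pr{N(x) = 0}, which (2.15) gives as e^(-lam x).
surv = math.exp(-lam * x)

# In a Bernoulli process with success probability lam*delta per slot of
# width delta, "no arrival in (0, x]" has probability (1 - lam*delta)^(x/delta),
# which tends to e^(-lam x) as delta shrinks.
delta = 1e-4
bern = (1.0 - lam * delta) ** (x / delta)
```

For small delta the Bernoulli approximation agrees with the exponential survival function to several decimal places, illustrating why the limiting interarrival distribution is exponential.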
Exercise 2.5. The point of this exercise is to show that the sequence of PMF’s for the
counting process of a Bernoulli process does not specify the process. In other words, knowing
that N (t) satisﬁes the binomial distribution for all t does not mean that the process is
Bernoulli. This helps us understand why the second definition of a Poisson process requires
stationary and independent increments as well as the Poisson distribution for N (t).
a) For a sequence of binary rv's Y1, Y2, Y3, . . . , in which each rv is 0 or 1 with equal probability, find a joint distribution for Y1, Y2, Y3 that satisfies the binomial distribution, pN(t)(k) = (t choose k) 2^(−t) for t = 1, 2, 3 and 0 ≤ k ≤ t, but for which Y1, Y2, Y3 are not independent. Your solution should contain four 3-tuples with probability 1/8 each, two 3-tuples with probability 1/4 each, and two 3-tuples with probability 0. Note that by making the subsequent arrivals IID and equiprobable, you have an example where N(t) is binomial for all t but the process is not Bernoulli. Hint: Use the binomial for t = 3 to find two 3-tuples that must have probability 1/8. Combine this with the binomial for t = 2 to find two other 3-tuples with probability 1/8. Finally look at the constraints imposed by the binomial distribution on the remaining four 3-tuples.
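A candidate joint PMF with the counts described above can be checked mechanically. The particular assignment below is one construction of this sketch, not the only valid answer, so treat it as a way to verify your own solution rather than as the solution:

```python
from fractions import Fraction
from math import comb

eighth, quarter = Fraction(1, 8), Fraction(1, 4)

# One candidate joint PMF on (Y1, Y2, Y3): four 3-tuples at 1/8,
# two at 1/4, and two at 0 (an assumption of this sketch).
pmf = {
    (0, 0, 0): eighth,  (0, 0, 1): eighth,
    (1, 1, 0): eighth,  (1, 1, 1): eighth,
    (0, 1, 0): quarter, (1, 0, 1): quarter,
    (1, 0, 0): Fraction(0), (0, 1, 1): Fraction(0),
}

def count_pmf(t, k):
    """Pr{N(t) = k} under the joint PMF above."""
    return sum(p for y, p in pmf.items() if sum(y[:t]) == k)

# N(t) matches binomial(t, 1/2) for t = 1, 2, 3 ...
binom_ok = all(count_pmf(t, k) == Fraction(comb(t, k), 2**t)
               for t in (1, 2, 3) for k in range(t + 1))

# ... yet the Y's are not independent: an independent triple with
# equiprobable bits would give every 3-tuple probability 1/8.
independent = all(p == eighth for p in pmf.values())
```

Exact rational arithmetic (`fractions.Fraction`) makes the binomial check an equality rather than a floating-point comparison.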
b) Generalize part a) to the case where Y1, Y2, Y3 satisfy Pr{Yi = 1} = q and Pr{Yi = 0} = 1 − q. Assume q < 1/2 and find a joint distribution on Y1, Y2, Y3 that satisfies the binomial distribution, but for which the 3-tuple (0, 1, 1) has zero probability.
c) More generally yet, view a joint PMF on binary t-tuples as a nonnegative vector in a 2^t-dimensional vector space. Each binomial probability pN(τ)(k) = (τ choose k) q^k (1 − q)^(τ−k) constitutes
a linear constraint on this vector. For each τ , show that one of these constraints may be
replaced by the constraint that the components of the vector sum to 1.
d) Using part c), show that at...
This note was uploaded on 09/27/2010 for the course EE 229 taught by Professor R. Srikant during the Spring '09 term at the University of Illinois, Urbana-Champaign.