The joint density then becomes5
f_{S1···Sn}(s1, . . . , sn) = λ^n exp(−λ sn)   for 0 ≤ s1 ≤ s2 ≤ · · · ≤ sn.     (2.14)

The interpretation here is the same as with S2. The joint density does not contain any
arrival time other than sn , except for the ordering constraint 0 ≤ s1 ≤ s2 ≤ · · · ≤ sn , and
thus this joint density is the same for all choices of arrival times satisfying the ordering
constraint. Mechanically integrating this over s1, then s2, etc., we get the Erlang formula
(2.12). The Erlang density then is the joint density in (2.14) times the volume s_n^{n−1}/(n−1)!
of the region of s1, . . . , sn−1 satisfying 0 < s1 < · · · < sn−1 < sn. This will be discussed further
later.

2.2.3 The PMF for N(t)

The Poisson counting process {N(t); t > 0} consists of a discrete rv N(t) for each t > 0. In
this section, we show that the PMF for this rv is the well-known Poisson PMF, as stated in
the following theorem. We give two proofs for the theorem, each providing its own type of
understanding and each showing the close relationship between {N (t) = n} and {Sn = t}.
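This close relationship can be seen concretely in simulation. The following sketch is purely illustrative (the rate λ = 1, time t = 3, and index n = 4 are arbitrary choices, not from the text): it builds arrival epochs as partial sums of IID exponential interarrival times and checks that the events {N(t) ≥ n} and {Sn ≤ t} coincide on every sample path.

```python
import math
import random

random.seed(0)
lam, t, n = 1.0, 3.0, 4
trials = 50_000

agree = 0       # paths on which {N(t) >= n} and {S_n <= t} give the same answer
tail_count = 0  # occurrences of the event {N(t) >= n}
for _ in range(trials):
    # Arrival epochs S_1, S_2, ... are partial sums of exponential interarrivals.
    s, epochs = 0.0, []
    while s <= t or len(epochs) < n:
        s += random.expovariate(lam)
        epochs.append(s)
    n_t = sum(1 for e in epochs if e <= t)  # N(t): number of arrivals in (0, t]
    s_n = epochs[n - 1]                     # S_n: epoch of the n-th arrival
    agree += ((n_t >= n) == (s_n <= t))
    tail_count += (n_t >= n)

# The events agree on every path, and their common probability matches the
# Poisson tail sum_{i>=n} (lam t)^i e^{-lam t} / i!.
poisson_tail = 1 - sum((lam * t)**i * math.exp(-lam * t) / math.factorial(i)
                       for i in range(n))
print(agree == trials)
print(tail_count / trials, poisson_tail)
```

The per-path agreement is exact, not statistical: {N(t) ≥ n} and {Sn ≤ t} are the same event, which is the relation exploited in Proof 2 below.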
Theorem 2.3. For a Poisson process of rate λ, and for any t > 0, the PMF for N(t) (i.e.,
the number of arrivals in (0, t]) is given by the Poisson PMF,

p_{N(t)}(n) = (λt)^n exp(−λt) / n!.     (2.15)

Proof 1: This proof, for given n and t, is based on two ways of calculating the probability
Pr {t < Sn+1 ≤ t + δ } for some vanishingly small δ . The ﬁrst way is based on the already
5 The random vector S = (S1, . . . , Sn) is then related to the interarrival intervals X = (X1, . . . , Xn)
by a linear transformation, say S = AX. In general, the joint density of S at s = Ax is f_S(s) = f_X(x)/|det A|.
This is because the transformation A carries a cube δ on a side into a parallelepiped of volume δ^n |det A|. In
the case here, A is upper triangular with 1's on the diagonal, so det A = 1.

known density of Sn+1 and gives
Pr{t < Sn+1 ≤ t + δ} = ∫_t^{t+δ} f_{Sn+1}(τ) dτ = f_{Sn+1}(t)(δ + o(δ)).

The term o(δ) is used to describe a function of δ that goes to 0 faster than δ as δ → 0.
More precisely, a function g(δ) is said to be of order o(δ) if lim_{δ→0} g(δ)/δ = 0. Thus
Pr{t < Sn+1 ≤ t + δ} = f_{Sn+1}(t)(δ + o(δ)) is simply a consequence of the fact that Sn+1 has a
continuous probability density in the interval [t, t + δ].
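The role of the o(δ) term can be illustrated numerically. The sketch below is not part of the proof (the values λ = 2 and n + 1 = 4 are arbitrary choices): it integrates the Erlang density of (2.12) over (t, t + δ] and shows the ratio Pr{t < Sn+1 ≤ t + δ}/δ approaching fSn+1(t) as δ shrinks.

```python
import math

def erlang_pdf(s, m, lam):
    # Erlang density (2.12) for the epoch S_m of the m-th arrival.
    return lam**m * s**(m - 1) * math.exp(-lam * s) / math.factorial(m - 1)

def prob_in_interval(t, delta, m, lam, steps=10_000):
    """Pr{t < S_m <= t + delta} by trapezoidal integration of the density."""
    h = delta / steps
    total = 0.5 * (erlang_pdf(t, m, lam) + erlang_pdf(t + delta, m, lam))
    total += sum(erlang_pdf(t + k * h, m, lam) for k in range(1, steps))
    return total * h

lam, m, t = 2.0, 4, 1.0       # m plays the role of n + 1
f_t = erlang_pdf(t, m, lam)
for delta in (0.1, 0.01, 0.001):
    ratio = prob_in_interval(t, delta, m, lam) / delta
    print(delta, ratio, f_t)  # ratio approaches f_t; the gap is the o(delta)/delta term
```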
The second way is that {t < Sn+1 ≤ t + δ } occurs if exactly n arrivals arrive in the interval
(0, t] and one arrival occurs in (t, t + δ ]. Because of the independent increment property,
this is an event of probability p_{N(t)}(n)(λδ + o(δ)). It is also possible to have fewer than n
arrivals in (0, t] and more than one in (t, t + δ ], but this has probability o(δ ). Thus
p_{N(t)}(n)(λδ + o(δ)) + o(δ) = f_{Sn+1}(t)(δ + o(δ)).
Dividing by δ and taking the limit δ → 0, we get

λ p_{N(t)}(n) = f_{Sn+1}(t).

Using the density for f_{Sn+1} given by (2.12) (with n replaced by n + 1), we get (2.15).
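Both conclusions of Proof 1 are easy to check numerically. The sketch below is illustrative only (λ = 1.5, t = 2, and n = 3 are arbitrary choices): it estimates p_{N(t)}(n) by simulating exponential interarrivals, compares the estimate with (2.15), and checks the relation λ p_{N(t)}(n) = f_{Sn+1}(t).

```python
import math
import random

random.seed(2)
lam, t, n = 1.5, 2.0, 3
trials = 100_000

def count_arrivals(lam, t):
    """N(t) for one sample path: count arrivals whose epoch falls in (0, t]."""
    count, s = 0, random.expovariate(lam)
    while s <= t:
        count += 1
        s += random.expovariate(lam)
    return count

p_hat = sum(1 for _ in range(trials) if count_arrivals(lam, t) == n) / trials
p_exact = (lam * t)**n * math.exp(-lam * t) / math.factorial(n)        # (2.15)
f_sn1 = lam**(n + 1) * t**n * math.exp(-lam * t) / math.factorial(n)   # (2.12) with n+1
print(p_hat, p_exact)          # empirical PMF vs. the Poisson PMF
print(lam * p_exact, f_sn1)    # lam * p_N(t)(n) equals f_{S_{n+1}}(t)
```

Note that λ p_exact and f_sn1 agree exactly, not just approximately, since they are algebraically the same expression.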
Proof 2: The approach here is to use the fundamental relation that {N(t) ≥ n} = {Sn ≤ t}.
Taking the probabilities of these events,
∑_{i=n}^∞ p_{N(t)}(i) = ∫_0^t f_{Sn}(τ) dτ   for all n ≥ 1 and t > 0.

The term on the right above is the distribution function of Sn for each n ≥ 1 and the term
on the left is the complementary distribution function of N (t) for each t > 0. Thus this
equation (for all n ≥ 1, t > 0) uniquely specifies the PMF of N(t) for each t > 0. The
theorem will then be proven by showing that
∑_{i=n}^∞ (λt)^i exp(−λt) / i! = ∫_0^t f_{Sn}(τ) dτ.     (2.16)

If we take the derivative with respect to t of each side of (2.16), we find that almost magically
each term except the ﬁrst on the left cancels out, leaving us with
λ^n t^{n−1} exp(−λt) / (n − 1)! = f_{Sn}(t).
Thus the derivative with respect to t of each side of (2.16) is equal to that of the other for
all n ≥ 1 and t > 0. The two sides of (2.16) are also equal in the limit t → 0, so it follows
that (2.16) is satisfied everywhere, completing the proof.

2.2.4 Alternate definitions of Poisson processes

Definition 2 of a Poisson process: A Poisson counting process {N(t); t ≥ 0} is a
counting process that satisﬁes (2.15) (i.e., has the Poisson PMF) and has the independent
and stationary increment properties.
We have seen that the properties in Deﬁnition 2 are satisﬁed starting with Deﬁnition 1
(using IID exponential interarrival times), so Deﬁnition 1 implies Deﬁnition 2. Exercise
2.4 shows that IID exponential interarrival times are implied by Deﬁnition 2, so the two
deﬁnitions are equivalent.
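The identity (2.16) that links the Poisson PMF to the Erlang arrival-epoch densities can also be verified numerically. The sketch below is an illustration (λ = 2 and t = 1.5 are arbitrary; the infinite sum is truncated at 60 terms, far past where the terms become negligible): it compares the Poisson tail with a numerical integral of the Erlang density.

```python
import math

lam, t = 2.0, 1.5

def poisson_tail(n, lam, t, terms=60):
    """Left side of (2.16): sum_{i=n}^infinity (lam t)^i e^{-lam t} / i!, truncated."""
    return sum((lam * t)**i * math.exp(-lam * t) / math.factorial(i)
               for i in range(n, n + terms))

def erlang_cdf(n, lam, t, steps=20_000):
    """Right side of (2.16): trapezoidal integral of the Erlang density (2.12) over (0, t]."""
    def f(s):
        return lam**n * s**(n - 1) * math.exp(-lam * s) / math.factorial(n - 1)
    h = t / steps
    return h * (0.5 * (f(0.0) + f(t)) + sum(f(k * h) for k in range(1, steps)))

for n in (1, 2, 5):
    print(n, poisson_tail(n, lam, t), erlang_cdf(n, lam, t))  # the two sides agree
```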
It may be somewhat surprising at ﬁrst to realize that a counting process that has the
Poisson PMF at each t is not necessarily a Poisson process, and that the independent and
stationary increment properties are also necessary. One way to see this is to recall that
the Poisson PMF for all t in a count...