h(x) + h(t) for all x, t ≥ 0. These two statements (see Exercise 2.6) imply
that h(x) must be linear in x, and Pr {X > x} must be exponential in x.
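The implication can be written out as a short derivation (a sketch; it assumes, as in the discussion just before this excerpt, that h(x) denotes ln Pr{X > x}):

```latex
% Sketch, assuming h(x) = \ln \Pr\{X > x\}.
% Additivity (the Cauchy functional equation), together with h nonincreasing,
% forces linearity:
h(x+t) = h(x) + h(t) \;\; \text{for all } x, t \ge 0
\quad\Longrightarrow\quad h(x) = -\lambda x \;\text{ for some } \lambda \ge 0,
% and exponentiating recovers the exponential tail:
\Pr\{X > x\} = e^{h(x)} = e^{-\lambda x}, \qquad x \ge 0.
```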
Although the exponential distribution is the only memoryless distribution, it is interesting
to note that if we restrict the definition of memoryless to integer times, then the geometric
distribution is memoryless, so the Bernoulli process in this respect seems like a discrete-time
version of the Poisson process.
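As a quick numeric illustration (a sketch, not from the text, with an arbitrary success probability p), the geometric tail Pr{X > n} = (1 − p)^n satisfies the integer-time memoryless condition exactly:

```python
# Check that the geometric distribution is memoryless at integer times:
# Pr{X > n + m | X > m} = Pr{X > n}, where Pr{X > n} = (1 - p)**n.
def geom_tail(n, p):
    """Tail probability Pr{X > n} for a geometric rv with success prob p."""
    return (1 - p) ** n

p = 0.3  # arbitrary choice for illustration
for m in range(1, 5):
    for n in range(1, 5):
        cond = geom_tail(n + m, p) / geom_tail(m, p)  # Pr{X > n+m | X > m}
        assert abs(cond - geom_tail(n, p)) < 1e-12
print("geometric memoryless check passed")
```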
We now use the memoryless property of the exponential rv to find the distribution of
the first arrival in a Poisson process after some given time t > 0. We not only find this
distribution, but also show that this first arrival after t is independent of all arrivals up
to and including t. Note that t is an arbitrarily selected constant here; it is not a random
variable. Let Z be the duration of the interval from t until the first arrival after t. First we
find Pr {Z > z | N (t) = 0}.

Figure 2.2: For some fixed t, consider the event N (t) = 0. Conditional on this event, Z is the interval from t to S1; i.e., Z = X1 − t. [Figure: a time axis from 0 through S1 and S2, with the intervals X1 (from 0 to S1), X2 (from S1 to S2), and Z (from t to S1) marked.]

As illustrated in Figure 2.2, for {N (t) = 0}, the first arrival after t is the first arrival of the
process. Stating this more precisely, the following events are identical:3
{Z > z} ∩ {N (t) = 0} = {X1 > z + t} ∩ {N (t) = 0}.

The conditional probabilities are then

Pr {Z > z | N (t)=0} = Pr {X1 > z + t | N (t)=0}
                     = Pr {X1 > z + t | X1 > t}          (2.5)
                     = Pr {X1 > z} = e^(−λz).            (2.6)

In (2.5), we used the fact that {N (t) = 0} = {X1 > t}, which is clear from Figure 2.1. In
(2.6) we used the memoryless condition in (2.4) and the fact that X1 is exponential.
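A small simulation can corroborate this (a sketch; the rate λ, conditioning time t, and threshold z below are arbitrary choices): conditioning an exponential X1 on X1 > t and measuring Z = X1 − t reproduces the unconditioned exponential tail.

```python
import random
import math

random.seed(0)
lam, t, z = 1.5, 2.0, 0.8   # arbitrary rate, conditioning time, threshold
n_trials = 200_000

# Sample X1 ~ Exp(lam); keep only trials with N(t) = 0, i.e. X1 > t.
samples = [random.expovariate(lam) for _ in range(n_trials)]
conditioned = [x1 - t for x1 in samples if x1 > t]   # Z = X1 - t

# Empirical Pr{Z > z | N(t) = 0} vs the theoretical tail e^(-lam*z).
empirical = sum(zv > z for zv in conditioned) / len(conditioned)
theoretical = math.exp(-lam * z)
print(f"empirical {empirical:.3f} vs theoretical {theoretical:.3f}")
assert abs(empirical - theoretical) < 0.03
```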
Next consider the condition that there are n arrivals in (0, t] and the nth occurs at epoch
Sn = τ ≤ t. The argument here is basically the same as that with N (t) = 0, with a few
extra details (see Figure 2.3).
Figure 2.3: Given N (t) = 2 and S2 = τ, X3 is equal to Z + (t − τ). Also, the event {N (t)=2, S2 =τ} is the same as the event {S2 =τ, X3 > t−τ}. [Figure: a time axis from 0 through S1, τ = S2, t, and S3, with the intervals X1, X2, X3, and Z (from t to S3) marked, and N (t) shown as the arrival count in (0, t].]

Conditional on N (t) = n and Sn = τ, the first arrival after t is the first arrival after the
arrival at Sn, i.e., Z = z corresponds to Xn+1 = z + t − τ. Stating this precisely, the following
events are identical:
{Z > z} ∩ {N (t) = n} ∩ {Sn = τ} = {Xn+1 > z + t − τ} ∩ {N (t) = n} ∩ {Sn = τ}.
3 It helps intuition to sometimes think of one event A as conditional on another event B. More precisely, A given B is the set of sample points in B that are also in A, which is simply A ∩ B.

Note that Sn = τ is an event of zero probability, but Sn is a sum of n IID random variables
with densities, and thus has a density itself, so that other events can be conditioned on it.
Pr {Z > z | N (t)=n, Sn =τ} = Pr {Xn+1 > z + t − τ | N (t)=n, Sn =τ}        (2.7)
                            = Pr {Xn+1 > z + t − τ | Xn+1 > t − τ, Sn =τ}    (2.8)
                            = Pr {Xn+1 > z + t − τ | Xn+1 > t − τ}           (2.9)
                            = Pr {Xn+1 > z} = e^(−λz).                       (2.10)

In (2.8), we have used the fact that, given Sn = τ, the event N (t) = n is the same as
Xn+1 > t − τ (see Figure 2.3). In (2.9) we used the fact that Xn+1 is independent of Sn .
In (2.10) we used the memoryless condition in (2.4) and the fact that Xn+1 is exponential.
The same argument applies if, in (2.7), we condition not only on Sn but also on S1 , . . . , Sn−1 .
Since this is equivalent to conditioning on N (τ ) for all τ in (0, t], we have
Pr {Z > z | {N (τ ), 0 < τ ≤ t}} = exp(−λz).    (2.11)

The following theorem states this in words.
Theorem 2.1. For a Poisson process of rate λ, and any given time t > 0, the interval
from t until the first arrival after t is a nonnegative rv Z with the distribution function
1 − exp[−λz] for z ≥ 0. This rv is independent of all arrival epochs before time t and
independent of N (τ ) for all τ ≤ t.
The length of our derivation of (2.11) somewhat hides its conceptual simplicity. Z , conditional on the time τ of the last arrival before t, is simply the remaining time until the
next arrival, which, by the memoryless property, is independent of τ ≤ t, and hence also
independent of everything before t.
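Theorem 2.1 can also be checked by direct simulation of the whole process (a sketch; λ, t, and the values of N(t) inspected below are arbitrary choices): the conditional mean of Z should be 1/λ whatever N(t) turns out to be.

```python
import random

random.seed(1)
lam, t = 2.0, 3.0          # arbitrary rate and observation time
n_trials = 100_000

# For each trial, generate arrivals past time t and record N(t) and
# Z = (first arrival after t) - t.
records = []               # (N(t), Z) pairs
for _ in range(n_trials):
    s, n = 0.0, 0
    while True:
        s += random.expovariate(lam)   # next arrival epoch
        if s > t:
            records.append((n, s - t))
            break
        n += 1

# Z should be Exp(lam) regardless of N(t): compare conditional means
# of Z, for several common values of N(t), against 1/lam.
for k in (4, 5, 6, 7):
    zs = [z for n, z in records if n == k]
    assert abs(sum(zs) / len(zs) - 1 / lam) < 0.05
print("conditional means of Z are all close to 1/lam =", 1 / lam)
```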
Next consider subsequent interarrival intervals after a given time t. For m ≥ 2, let Zm be
the interarrival interval from the (m − 1)th arrival epoch after t to the mth arrival epoch after
t. Given N (t) = n, we see that Zm = Xm+n , and therefore Z2 , Z3 , . . . , are IID exponentially
distributed random variables, conditional on N (t) = n (see Exercise 2.8). Let Z in (2.11)
become Z1 here. Since Z1 is independent of Z2 , Z3 , . . . and independent of N (t), we see
that Z1 , Z2 , . . . are unconditionally IID and also independent of N (t). It should also be
clear that Z1 , Z2 , . . . are independent of {N (τ ); 0 < τ ≤ t}.
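This claim, too, admits a simulation check (a sketch with arbitrary λ and t): the first two post-t intervals Z1 and Z2 should each have mean 1/λ and be uncorrelated with each other and with N(t).

```python
import random
import math

random.seed(2)
lam, t = 1.0, 5.0          # arbitrary rate and observation time
n_trials = 50_000

data = []                  # (N(t), Z1, Z2) per trial
for _ in range(n_trials):
    s, n = 0.0, 0
    arrivals_after_t = []
    while len(arrivals_after_t) < 2:
        s += random.expovariate(lam)   # next arrival epoch
        if s <= t:
            n += 1                     # counts toward N(t)
        else:
            arrivals_after_t.append(s)
    z1 = arrivals_after_t[0] - t
    z2 = arrivals_after_t[1] - arrivals_after_t[0]
    data.append((n, z1, z2))

def corr(xs, ys):
    """Sample correlation coefficient."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / len(xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / len(ys))
    return cov / (sx * sy)

ns, z1s, z2s = zip(*data)
assert abs(sum(z1s) / len(z1s) - 1 / lam) < 0.02   # Z1 mean ~ 1/lam
assert abs(sum(z2s) / len(z2s) - 1 / lam) < 0.02   # Z2 mean ~ 1/lam
assert abs(corr(z1s, z2s)) < 0.02                  # Z1, Z2 uncorrelated
assert abs(corr(ns, z1s)) < 0.02                   # Z1 uncorrelated with N(t)
print("Z1, Z2 look IID Exp(lam), independent of N(t)")
```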
This note was uploaded on 09/27/2010 for the course EE 229 taught by Professor R. Srikant during the Spring '09 term at University of Illinois, Urbana-Champaign.