(Figure caption: Density for the order statistics of an IID 2-dimensional uniform distribution. Note that the square over which fU(2) is nonzero contains one triangle where u2 > u1 and another of equal size where u1 > u2. Each of these maps, by a permutation mapping, into the single triangle where s2 > s1.)

Next we want to find the marginal distribution functions of the individual Si conditional on
N (t) = n. Starting with S1 , and viewing it as the minimum of the IID uniformly distributed
variables U1 , . . . , Un , we recognize that S1 > τ if and only if Ui > τ for all i, 1 ≤ i ≤ n.
Thus,

    Pr{S1 > τ | N(t)=n} = ((t − τ)/t)^n    for 0 < τ ≤ t.    (2.37)
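The tail probability in (2.37) is easy to spot-check by simulation, since conditional on N(t) = n the arrival epochs are distributed as the order statistics of n IID uniform rv's on (0, t], so S1 is the minimum of n uniforms. A minimal sketch (the values of t, n, and τ below are arbitrary illustrative choices):

```python
import random

# Conditional on N(t) = n, the arrival epochs are the order statistics
# of n IID uniforms on (0, t]; S1 is the minimum of those n uniforms.
t, n, tau = 2.0, 5, 0.4
trials = 200_000

random.seed(1)
count = sum(
    min(random.uniform(0, t) for _ in range(n)) > tau
    for _ in range(trials)
)
empirical = count / trials
exact = ((t - tau) / t) ** n          # the tail probability (2.37)

print(f"empirical Pr(S1 > tau)  = {empirical:.4f}")
print(f"exact ((t - tau)/t)**n  = {exact:.4f}")
```

With 200,000 trials the empirical value should agree with ((t − τ)/t)^n = 0.8^5 ≈ 0.328 to about two decimal places.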
Since this is the complement of the distribution function of S1 , conditional on N (t) = n,
we can integrate it to get the conditional mean of S1 ,
    E[S1 | N(t)=n] = t/(n + 1).    (2.38)

We come back later to the distribution functions of S2, . . . , Sn, and first look at the marginal
distributions of the interarrival intervals. Recall from (2.36) that
    fX(n)|N(t)(x(n) | n) = n!/t^n    for X(n) > 0, ∑_{i=1}^n Xi < t.    (2.39)

The joint density is the same for all points in the constraint region, and the constraint does
not distinguish among X1, . . . , Xn. Thus they must all have the same marginal distribution,
and more generally the marginal distribution of any subset of the Xi can depend only on
the size of the subset. We have found the distribution of S1 , which is the same as X1 , and
thus
    Pr{Xi > τ | N(t)=n} = ((t − τ)/t)^n    for 1 ≤ i ≤ n and 0 < τ ≤ t.    (2.40)

    E[Xi | N(t)=n] = t/(n + 1)    for 1 ≤ i ≤ n.    (2.41)
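The symmetry claim behind (2.40) and (2.41), that every Xi (not just X1) has mean t/(n + 1), can be checked numerically by generating the order statistics directly and differencing them. A sketch with arbitrary illustrative values of t and n:

```python
import random

# Conditional on N(t) = n, generate the arrival epochs as sorted
# uniforms on (0, t] and form the interarrival intervals X1, ..., Xn.
t, n = 2.0, 5
trials = 200_000
random.seed(2)

sums = [0.0] * n
for _ in range(trials):
    s = sorted(random.uniform(0, t) for _ in range(n))
    prev = 0.0
    for i, si in enumerate(s):
        sums[i] += si - prev          # Xi = Si - S(i-1), with S0 = 0
        prev = si

means = [total / trials for total in sums]
print("empirical E[Xi | N(t)=n]:", [round(m, 3) for m in means])
print("predicted t/(n + 1):     ", round(t / (n + 1), 3))
```

All n empirical means should cluster around t/(n + 1) = 1/3 here, even though the Xi are far from independent given N(t) = n.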
Next define X*n+1 = t − Sn to be the interval from the largest of the IID variables to t, the right end of the interval. Using (2.39),

    fX(n)|N(t)(x(n) | n) = n!/t^n    for X(n) > 0, X*n+1 > 0, ∑_{i=1}^n Xi + X*n+1 = t.

The constraints above are symmetric in X1, . . . , Xn, X*n+1, and the density of X1, . . . , Xn within the constraint region is uniform. This density can be replaced by a density over any other n rv's out of X1, . . . , Xn, X*n+1 by a linear transformation with unit determinant. Thus X*n+1 has the same marginal distribution as each of the Xi. This gives us a partial check on our work, since the interval (0, t] is divided into n + 1 intervals of sizes X1, X2, . . . , Xn, X*n+1, and each of these has a mean size t/(n + 1). We also see that the joint distribution function of any proper subset of X1, X2, . . . , Xn, X*n+1 is independent of the order of the variables.

Next consider the distribution function of Xi+1 for i < n, conditional both on N(t) = n
and Si = si (or conditional on any given values for X1 , . . . , Xi summing to si ). We see
that Xi+1 is just the wait until the ﬁrst arrival in the interval (si , t], given that this interval
contains n − i arrivals. From the same argument as used in (2.37), we have
    Pr{Xi+1 > τ | N(t)=n, Si=si} = ((t − si − τ)/(t − si))^(n−i).    (2.42)
Since Si+1 is Xi+1 + Si , this immediately gives us the conditional distribution of Si+1
    Pr{Si+1 > si+1 | N(t)=n, Si=si} = ((t − si+1)/(t − si))^(n−i).    (2.43)
We note that this is independent of S1 , . . . , Si−1 . As a check, one can ﬁnd the conditional
densities from (2.43) and multiply them all together to get back to (2.34) (see Exercise
2.25).
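One way to see (2.43) in action is to sample S1, . . . , Sn sequentially by inverting the conditional tail: if U is uniform on (0, 1), then s = t − (t − si)·U^(1/(n−i)) has exactly the tail in (2.43). The resulting epochs can then be compared against sorting n uniforms directly. A sketch with arbitrary illustrative values of t and n:

```python
import random

# Sequentially sample S1, ..., Sn using the conditional tail (2.43):
# Pr{S_{i+1} > s | Si = si} = ((t - s)/(t - si))**(n - i), inverted as
# s = t - (t - si) * U**(1/(n - i)) for U uniform on (0, 1).
# Compare the resulting mean epochs against sorted uniforms on (0, t].
t, n = 2.0, 4
trials = 100_000
random.seed(3)

seq_sums = [0.0] * n
sort_sums = [0.0] * n
for _ in range(trials):
    # sequential construction; S0 = 0, and the first step (i = 0)
    # uses exponent n, matching (2.37)
    s = 0.0
    for i in range(n):
        s = t - (t - s) * random.random() ** (1.0 / (n - i))
        seq_sums[i] += s
    # direct construction: order statistics of n uniforms
    for i, si in enumerate(sorted(random.uniform(0, t) for _ in range(n))):
        sort_sums[i] += si

print("sequential means:    ", [round(x / trials, 3) for x in seq_sums])
print("sorted-uniform means:", [round(x / trials, 3) for x in sort_sums])
# both should be close to i*t/(n + 1), i.e., roughly 0.4, 0.8, 1.2, 1.6
```

Agreement between the two constructions illustrates that chaining the conditionals (2.43) reproduces the joint distribution of the order statistics.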
We can also find the distribution of each Si conditioned on N(t) = n but unconditioned on
S1 , S2 , . . . , Si−1 . The density for this is calculated by looking at n uniformly distributed
rv’s in (0, t]. The probability that one of these lies in the interval (x, x + dt] is (n dt)/t. Out
of the remaining n − 1, the probability that i − 1 lie in the interval (0, x] is given by the
binomial distribution with probability of success x/t. Thus the desired density is
    fSi(x | N(t)=n) dt = [x^(i−1) (t − x)^(n−i) (n − 1)! / (t^(n−1) (n − i)! (i − 1)!)] · (n dt)/t

    fSi(x | N(t)=n) = x^(i−1) (t − x)^(n−i) n! / (t^n (n − i)! (i − 1)!).    (2.44)
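As a sanity check, the density in (2.44) can be integrated numerically: it should have total mass 1 and mean i·t/(n + 1), since Si is the sum of i intervals, each of mean t/(n + 1). A sketch with arbitrary illustrative values of t, n, and i:

```python
from math import factorial

# Numerically check (2.44): integrate the conditional density of Si
# over (0, t) and confirm unit mass and mean i*t/(n + 1).
t, n, i = 2.0, 5, 3

def f_si(x):
    coef = factorial(n) / (t**n * factorial(n - i) * factorial(i - 1))
    return coef * x**(i - 1) * (t - x)**(n - i)

steps = 100_000
dx = t / steps
xs = [(k + 0.5) * dx for k in range(steps)]   # midpoint rule
mass = sum(f_si(x) for x in xs) * dx
mean = sum(x * f_si(x) for x in xs) * dx

print(f"integral of density = {mass:.6f}")                     # ~ 1.0
print(f"mean of Si = {mean:.4f}, i*t/(n+1) = {i*t/(n+1):.4f}")
```

This is the Beta(i, n − i + 1) density scaled to (0, t], so both checks follow from standard Beta-distribution facts as well.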
2.6 Summary

We started the chapter with three equivalent definitions of a Poisson process—first as a
renewal process with exponentially distributed interrenewal intervals, second as a stationary and independent increment counting process with Poisson distributed arrivals in each
interval, and third essentially as a limit of shrinking Bernoulli processes. We saw that each
deﬁnition provided its own insights into the properties of the process. We emphasized the
importance of the memoryless property of the exponential distribution, both as a useful
tool in problem solving and as an underlying reason why the Poisson process is so simple.
We next showed that the sum of independent Poisson processes is again a Poisson process.
We also showed that if the arrivals in a Poisson process are independently routed to different locations with some fixed probability assignment, then the arrivals at each of these locatio...