(d) Using the hint given in the problem statement, we can find the mean of N4 by summing up the means of the 4 identically distributed geometric random variables, each with mean 4. This gives E[N4] = 4E[N1] = 16.
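As a sanity check (not part of the printed solution), a short Python simulation, assuming N4 is the sum of 4 iid geometric random variables with success probability p = 1/4, so that each has mean 1/p = 4:

```python
import random

random.seed(1)
p = 0.25  # assumed success probability, so each geometric has mean 1/p = 4

def geometric(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    n = 1
    while random.random() > p:
        n += 1
    return n

# Average many samples of N4 = sum of 4 independent geometric variables
trials = 100_000
mean_n4 = sum(sum(geometric(p) for _ in range(4)) for _ in range(trials)) / trials
# mean_n4 should be close to 4 * E[N1] = 16
```

The empirical mean lands near 16, matching the linearity-of-expectation argument.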
Problem 6.1.4 Solution
We can solve this pro

Problem Solutions Chapter 7
Problem 7.1.1 Solution
Recall that X1, X2, . . . , Xn are independent exponential random variables with mean value μX = 5, so that for x ≥ 0, FX(x) = 1 − e^{−x/5}.

(a) Using Theorem 7.1, Var[Mn(X)] = σX²/n. Realizing that σX² = 25, we
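A quick Python sketch (an illustration, not the textbook's method) confirming empirically that Var[Mn(X)] = σX²/n = 25/n for exponential samples with mean 5:

```python
import random

random.seed(2)
mu, n, trials = 5.0, 20, 40_000

# Each trial produces one sample mean of n exponential(mean 5) variables
means = [sum(random.expovariate(1 / mu) for _ in range(n)) / n
         for _ in range(trials)]
m = sum(means) / trials
var_mn = sum((x - m) ** 2 for x in means) / trials
# Theory: Var[Mn(X)] = 25 / n = 1.25 for n = 20
```

The empirical variance of the sample mean comes out close to 25/20 = 1.25.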

The resulting plot will be essentially identical to Figure 6.4. One final note: the command set(h,'LineWidth',0.25) is used to make the bars of the PMF thin enough to be resolved individually.
Problem 6.9.5 Solution
Since the ndgrid function extends naturally

In the graphs we will see that as n increases, the Erlang PDF becomes increasingly similar to the Gaussian PDF of the same expected value and variance. This is not surprising since the Erlang (n, λ) random variable is the sum of n exponential random var

function pb=binomialcdftest(N);
pb=zeros(1,N);
for n=1:N,
    w=[0.499 0.501]*10^n;
    w(1)=ceil(w(1))-1;
    pb(n)=diff(binomialcdf(10^n,0.5,w));
end
Unfortunately, on this user's machine (a Windows XP laptop), the program fails for N = 4. The problem, as noted earlier

Problem 6.8.4 Solution
This problem is solved completely in the solution to Quiz 6.8! We repeat that solution here. Since W = X1 + X2 + X3 is an Erlang (n = 3, λ = 1/2) random variable, Theorem 3.11 says that for any w > 0, the CDF of W satisfies

FW(w) = 1 − Σ_{k=0}^{2} ((λw)^k/k!) e^{−λw}
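The Erlang (3, λ = 1/2) CDF from Theorem 3.11 can be checked numerically in Python against a direct integration of the Erlang PDF (a sketch, with w = 4 chosen arbitrarily):

```python
import math

lam, n, w = 0.5, 3, 4.0

# Closed-form Erlang CDF: F_W(w) = 1 - sum_{k=0}^{n-1} (lam*w)^k e^{-lam*w} / k!
cdf_closed = 1 - sum((lam * w) ** k * math.exp(-lam * w) / math.factorial(k)
                     for k in range(n))

# Midpoint-rule integration of the Erlang PDF f(t) = lam^n t^(n-1) e^{-lam t}/(n-1)!
steps = 200_000
dt = w / steps
cdf_numeric = sum(lam ** n * ((i + 0.5) * dt) ** (n - 1)
                  * math.exp(-lam * (i + 0.5) * dt) / math.factorial(n - 1) * dt
                  for i in range(steps))
```

The two values agree to high precision, confirming the closed form.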

Equivalently, for p = 0.8, solving the quadratic equation

(500/n − p)² = (1.29)² p(1 − p)/n    (4)

we obtain n = 641.3. Thus we should test n = 642 circuits.
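Clearing the fraction in equation (4) gives a standard quadratic in n, which a few lines of Python can solve (a sketch under that reading of the equation):

```python
import math

p = 0.8
z = 1.29

# (500/n - p)^2 = z^2 p(1-p)/n; multiply through by n^2:
# p^2 n^2 - (1000 p + z^2 p (1-p)) n + 500^2 = 0
a = p ** 2
b = -(1000 * p + z ** 2 * p * (1 - p))
c = 500 ** 2
disc = math.sqrt(b ** 2 - 4 * a * c)
n_root = (-b + disc) / (2 * a)   # the larger root, approximately 641.3
n_circuits = math.ceil(n_root)   # round up to a whole number of circuits
```

The larger root reproduces n = 641.3, and rounding up gives the 642 circuits quoted in the solution.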
Problem 6.8.1 Solution
The N[0, 1] random variable Z has MGF φZ(s) = e^{s²/2}. The Chernoff bound is

P[Z ≥ c] ≤ min_{s≥0} e^{−sc} e^{s²/2}
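Minimizing the exponent −sc + s²/2 over s ≥ 0 gives s = c and hence the bound e^{−c²/2}; a Python grid-search check (illustrative only, with c = 2):

```python
import math

c = 2.0
# Chernoff bound: P[Z >= c] <= min over s >= 0 of e^{-sc} e^{s^2/2}
svals = [i * 0.001 for i in range(8001)]          # grid over s in [0, 8]
bound = min(math.exp(-s * c + s * s / 2) for s in svals)
analytic = math.exp(-c * c / 2)                   # minimizer is s = c
```

The grid minimum matches e^{−c²/2} to within the grid resolution.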

(c) Each Yn is the difference of two samples of X that are independent of the samples used by any other Ym. Thus Y1, Y2, . . . is an iid random sequence. By Theorem 7.1, the mean and variance of Mn(Y) are

E[Mn(Y)] = E[Yn] = 0        Var[Mn(Y)] = Var[Yn]/n = 2 Var[X]/n

Problem 7.2.4 Solution
Let X1 denote the number of rolls until the first occurrence of snake eyes. Similarly, let Xi denote the number of additional rolls for the ith occurrence. Since each roll is snake eyes with probability p = 1/36, X1, X2 and X3 a

Problem 7.4.2 Solution
X1, X2, . . . are iid random variables, each with mean 75 and standard deviation 15.
(a) We would like to find the value of n such that

P[74 ≤ Mn(X) ≤ 76] = 0.99    (1)

When we know only the mean and variance of Xi, our only real too

With some more algebra, we obtain

Var[X1 + · · · + Xn] = nσ² + (2σ²a/(1 − a)) [(n − 1) − (a + a² + · · · + a^{n−1})]    (5)
= nσ² + (2σ²a/(1 − a))(n − 1) − (2σ²a²/(1 − a)²)(1 − a^{n−1})    (6)

Since a/(1 − a) and 1 − a^{n−1} are both nonnegative,

Var[X1 + · · · + Xn] ≤ nσ² (1 + a)/(1 − a)    (7)
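Assuming the covariance structure Cov[Xi, Xj] = σ²a^{|i−j|} (with σ = 1 for simplicity), the variance of the sum and its upper bound can be checked numerically in Python:

```python
# Assumes Cov[X_i, X_j] = sigma^2 * a**|i-j| with sigma = 1; n and a illustrative
n, a = 10, 0.6

# Exact variance of X_1 + ... + X_n as a double sum of covariances
exact = sum(a ** abs(i - j) for i in range(n) for j in range(n))

# Closed form: n + 2a(n-1)/(1-a) - 2a^2 (1 - a^(n-1)) / (1-a)^2
closed = (n + 2 * a / (1 - a) * (n - 1)
          - 2 * a ** 2 / (1 - a) ** 2 * (1 - a ** (n - 1)))

# Upper bound: n (1+a)/(1-a)
bound = n * (1 + a) / (1 - a)
```

The double sum matches the closed form exactly and sits below the bound.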
(b) Since the expecte

It follows for n ≥ n0 that

P[|R̂n − r| ≥ ε] ≤ P[|R̂n − E[R̂n]| + ε/2 ≥ ε] = P[|R̂n − E[R̂n]| ≥ ε/2]    (3)

By the Chebyshev inequality, we have that

P[|R̂n − E[R̂n]| ≥ ε/2] ≤ Var[R̂n]/(ε/2)²    (4)

Combining these facts, we see for n ≥ n0 that

P[|R̂n − r| ≥ ε] ≤ Var[R̂n]/(ε/2)²

It follows that

lim_{n→∞} P[|R̂n − r| ≥ ε] ≤ lim_{n→∞} Var

Problem 7.3.5 Solution
Note that we can write Yk as

Yk = ((X_{2k−1} − X_{2k})/2)² + ((X_{2k} − X_{2k−1})/2)² = (X_{2k} − X_{2k−1})²/2    (1)

Hence,

E[Yk] = (1/2) E[X²_{2k} − 2X_{2k}X_{2k−1} + X²_{2k−1}] = E[X²] − (E[X])² = Var[X]    (2)
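Equation (2) can be verified exactly by enumeration over any small discrete distribution; a Python sketch using X uniform on {0, 1, 2} (an arbitrary choice for illustration):

```python
from itertools import product

# Arbitrary small PMF for X, just to check E[(X_{2k} - X_{2k-1})^2 / 2] = Var[X]
support = [0, 1, 2]
pmf = {0: 1/3, 1: 1/3, 2: 1/3}

ex = sum(x * pmf[x] for x in support)
ex2 = sum(x * x * pmf[x] for x in support)
var_x = ex2 - ex ** 2

# E[Yk] over all independent pairs (X_{2k-1}, X_{2k})
e_yk = sum(pmf[a] * pmf[b] * (b - a) ** 2 / 2
           for a, b in product(support, repeat=2))
```

Both quantities come out equal (2/3 for this PMF), as equation (2) predicts.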
Next we observe that Y1 , Y2 , . . . is an iid random seque

we observe that

P[max_{i,j} |R_{i,j}(n) − E[Xi Xj]| ≥ c] = P[∪_{i,j} A_{i,j}]    (3)

Applying the Chebyshev inequality to R_{i,j}(n), we find that

P[A_{i,j}] ≤ Var[R_{i,j}(n)]/c² = Var[Xi Xj]/(nc²)    (4)

By the union bound,

P[max_{i,j} |R_{i,j}(n) − E[Xi Xj]| ≥ c] ≤ Σ_{i,j} P[A_{i,j}] ≤ (1/n) Σ_{i,j} Var[Xi Xj]/c²

(a) By the Markov inequality,
P[R ≥ 250] ≤ E[R]/250 = 54/125 = 0.432    (2)

(b) By the Chebyshev inequality,

P[R ≥ 250] = P[R − 108 ≥ 142] ≤ P[|R − 108| ≥ 142]    (3)
≤ Var[R]/(142)² = 0.1875    (4)

(c) The exact value is P[R ≥ 250] = 1 − Σ_{r=3}^{249} PR(r). Since there is no w

Evaluation of these integrals depends on v through the function

f_{Y3}(v − y) = { 1, v − 1 < y < v;  0, otherwise }    (4)

To compute the convolution, it is helpful to depict the three distinct cases. In each case, the square pulse is f_{Y3}(v − y) and the triangular pulse is

(c)
P[X1 ≥ 9] = ∫_9^{10} f_{X1}(x) dx = ∫_9^{10} (1/6) dx = 1/6    (4)

(d) The variance of M16(X) is much less than Var[X1]. Hence, the PDF of M16(X) should be much more concentrated about E[X] than the PDF of X1. Thus we should expect P[M16(X) > 9] to be

(b) For C = 328.6, the exact probability of overload is

P[N > C] = 1 − P[N ≤ 328] = 1 − poissoncdf(300,328) = 0.0516,    (4)

which shows the central limit theorem approximation is reasonable.
(c) This part of the problem could be stated more carefully. Re-examin
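The MATLAB value above can be reproduced in pure Python, working with logarithms of the Poisson terms to avoid overflow (a check, assuming N is Poisson with mean 300):

```python
import math

alpha = 300           # mean of N, taken from the surrounding problem
C = 328

# P[N <= 328]: sum Poisson terms via logs, since 300**328 overflows a float
log_terms = [k * math.log(alpha) - alpha - math.lgamma(k + 1) for k in range(C + 1)]
cdf = sum(math.exp(t) for t in log_terms)
p_overload = 1 - cdf   # should be approximately 0.0516
```

This matches 1 − poissoncdf(300,328) = 0.0516 to the precision quoted.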

Comment: Perhaps a more interesting question is: why is the overload probability in a one-second interval so much higher than that in a one-minute interval? To answer this, consider a T-second interval in which the number of requests NT is a Poisson (αT)

Problem 6.2.4 Solution
In this problem, X and Y have joint PDF

f_{X,Y}(x, y) = { 8xy, 0 ≤ y ≤ x ≤ 1;  0, otherwise }    (1)

We can find the PDF of W using Theorem 6.4: fW(w) = ∫ f_{X,Y}(x, w − x) dx. The only tricky part remaining is to determine the limits of the integration

Problem 6.3.2 Solution
(a) By summing across the rows of the table, we see that J has PMF

PJ(j) = { 0.6, j = 2;  0.4, j = 1 }    (1)

The MGF of J is φJ(s) = E[e^{sJ}] = 0.6e^{2s} + 0.4e^{s}.
(b) Summing down the columns of the table, we see that K has PMF

PK(k) = { 0.7, k = 1;  0.2

Problem 6.2.6 Solution
The random variables K and J have PMFs

PJ(j) = { α^j e^{−α}/j!, j = 0, 1, 2, . . . ;  0, otherwise }
PK(k) = { β^k e^{−β}/k!, k = 0, 1, 2, . . . ;  0, otherwise }    (1)

For n ≥ 0, we can find the PMF of N = J + K via

P[N = n] = Σ_{k=0}^{n} P[J = n − k, K = k]    (2)

Since J an
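Though the text breaks off here, the convolution sum above produces a Poisson PMF with the summed parameter; a Python sketch with illustrative parameters α = 2 and β = 3 for the two Poisson PMFs:

```python
import math

alpha, beta = 2.0, 3.0   # illustrative Poisson parameters for J and K

def poisson_pmf(mean, k):
    return mean ** k * math.exp(-mean) / math.factorial(k)

# P[N = n] = sum_k P[J = n - k] P[K = k], using independence of J and K
n = 5
p_conv = sum(poisson_pmf(alpha, n - k) * poisson_pmf(beta, k) for k in range(n + 1))
p_direct = poisson_pmf(alpha + beta, n)   # Poisson(alpha + beta) evaluated at n
```

The convolved value matches the direct Poisson(α + β) PMF, illustrating that the sum of independent Poisson random variables is Poisson.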

We can use these results to derive two well-known results. We observe that we can directly use the PMF PK(k) to calculate the moments

E[K] = (1/n) Σ_{k=1}^{n} k        E[K²] = (1/n) Σ_{k=1}^{n} k²    (12)

Using the answers we found for E[K] and E[K²], we have the formulas

Σ_{k=1}^{n} k
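The moments in equation (12) have the familiar closed forms via the power sums Σk = n(n+1)/2 and Σk² = n(n+1)(2n+1)/6; a quick Python check for K uniform on {1, . . . , n}:

```python
n = 25   # illustrative value

# Moments of K from equation (12), by direct summation
ek = sum(range(1, n + 1)) / n
ek2 = sum(k * k for k in range(1, n + 1)) / n

# Closed forms from the standard power-sum identities
ek_closed = (n + 1) / 2
ek2_closed = (n + 1) * (2 * n + 1) / 6
```

Both moments match their closed forms exactly for any n.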

Similarly, the third moment of Y is

E[Y³] = E[(X + μ)³]    (6)
= E[X³ + 3X²μ + 3Xμ² + μ³] = 3σ²μ + μ³    (7)

Finally, the fourth moment of Y is

E[Y⁴] = E[(X + μ)⁴]    (8)
= E[X⁴ + 4X³μ + 6X²μ² + 4Xμ³ + μ⁴]    (9)
= 3σ⁴ + 6σ²μ² + μ⁴    (10)
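Equation (10) relies on the zero-mean Gaussian moments E[X³] = 0 and E[X⁴] = 3σ⁴; a numerical Python check by integrating (x + μ)⁴ against the N(0, σ²) density, with μ = 1 and σ = 2 chosen arbitrarily:

```python
import math

mu, sigma = 1.0, 2.0   # illustrative values

def normal_pdf(x, s):
    return math.exp(-x * x / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

# E[Y^4] with Y = X + mu, X ~ N(0, sigma^2), by midpoint-rule integration
lo, hi, steps = -12 * sigma, 12 * sigma, 400_000
dx = (hi - lo) / steps
e_y4 = sum((lo + (i + 0.5) * dx + mu) ** 4
           * normal_pdf(lo + (i + 0.5) * dx, sigma) * dx
           for i in range(steps))

# Closed form 3 sigma^4 + 6 mu^2 sigma^2 + mu^4 = 48 + 24 + 1 = 73
e_y4_closed = 3 * sigma ** 4 + 6 * mu ** 2 * sigma ** 2 + mu ** 4
```

The numerical integral agrees with the closed form 73.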
Problem 6.3.5 Solution
The PMF

Now to find the first moment, we evaluate the derivative of φX(s) at s = 0:

E[X] = dφX(s)/ds |_{s=0} = [s(b e^{bs} − a e^{as}) − (e^{bs} − e^{as})] / ((b − a)s²) |_{s=0}    (2)

Direct evaluation of the above expression at s = 0 yields 0/0, so we must apply l'Hôpital's rule and differentiate the nu
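The 0/0 limit can also be checked numerically: using the uniform (a, b) MGF φX(s) = (e^{bs} − e^{as})/((b − a)s), a central difference around s = 0 should recover E[X] = (a + b)/2. A Python sketch with illustrative a = 2, b = 6:

```python
import math

a, b = 2.0, 6.0   # illustrative uniform (a, b) endpoints

def mgf(s):
    # MGF of a uniform (a, b) random variable, valid for s != 0
    return (math.exp(b * s) - math.exp(a * s)) / ((b - a) * s)

# E[X] = phi'(0), approximated by a central difference around s = 0
h = 1e-4
ex_numeric = (mgf(h) - mgf(-h)) / (2 * h)
ex_exact = (a + b) / 2   # = 4.0
```

The numerical derivative matches (a + b)/2, consistent with what l'Hôpital's rule yields analytically.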

Problem 6.2.2 Solution
The joint PDF of X and Y is

f_{X,Y}(x, y) = { 1, 0 ≤ x, y ≤ 1;  0, otherwise }    (1)

Proceeding as in Problem 6.2.1, we must first find FW(w) by integrating over the square defined by 0 ≤ x, y ≤ 1. Again we are forced to find FW(w) in parts as we did in Pro
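As a sketch of what that integration produces (assuming W = X + Y, as in Problem 6.2.1), FW(w) is the area of the region {x + y ≤ w} inside the unit square, which gives the triangular CDF w²/2 on [0, 1] and 1 − (2 − w)²/2 on [1, 2]. A quick Python grid check:

```python
# F_W(w) = area of {(x, y) in unit square : x + y <= w}, assuming W = X + Y
def fw(w, steps=1000):
    dx = 1.0 / steps
    # For each x, the set {y in [0,1] : y <= w - x} has length clamped to [0, 1]
    return sum(min(max(w - (i + 0.5) * dx, 0.0), 1.0) for i in range(steps)) * dx

f_half = fw(0.5)          # piecewise formula gives 0.5**2 / 2 = 0.125
f_three_halves = fw(1.5)  # piecewise formula gives 1 - 0.5**2 / 2 = 0.875
```

Both grid values match the piecewise formula, one point from each part of the CDF.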

Problem 6.1.5 Solution
Since each Xi has zero mean, the mean of Yn is

E[Yn] = E[Xn + X_{n−1} + X_{n−2}]/3 = 0    (1)

Since Yn has zero mean, the variance of Yn is

Var[Yn] = E[Yn²] = (1/9) E[(Xn + X_{n−1} + X_{n−2})²]
= (1/9) E[Xn² + X²_{n−1} + X²_{n−2} + 2Xn X_{n−1} + 2X

Problem 6.4.4 Solution
Based on the problem statement, the number of points Xi that you earn for game i has PMF

P_{Xi}(x) = { 1/3, x = 0, 1, 2;  0, otherwise }    (1)

(a) The MGF of Xi is

φ_{Xi}(s) = E[e^{sXi}] = 1/3 + e^s/3 + e^{2s}/3    (2)

Since Y = X1 + · · · + Xn, Theore
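A quick Python check of equation (2): any MGF equals 1 at s = 0, and its derivative at s = 0 should equal E[Xi] = (0 + 1 + 2)/3 = 1:

```python
import math

def phi_x(s):
    # MGF of X_i uniform on {0, 1, 2}: (1 + e^s + e^{2s}) / 3
    return (1 + math.exp(s) + math.exp(2 * s)) / 3

h = 1e-5
phi_at_zero = phi_x(0.0)                  # any MGF equals 1 at s = 0
ex = (phi_x(h) - phi_x(-h)) / (2 * h)     # central difference, approx E[X_i] = 1
```

Both sanity checks hold, consistent with the PMF in equation (1).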