(b) Using the chain rule, we have
$$\mathbf{E}[Y] = \left.\frac{d}{ds}M_Y(s)\right|_{s=0} = \left.\frac{d}{ds}M_X(s)\right|_{s=0}\cdot \left.\lambda e^{\lambda(M_X(s)-1)}\right|_{s=0} = \frac{1}{2}\cdot\lambda = \frac{\lambda}{2},$$
where we have used the fact that $M_X(0) = 1$.
(c) From the law of iterated expectations we obtain
$$\mathbf{E}[Y] = \mathbf{E}\big[\mathbf{E}[Y \mid N]\big] = \mathbf{E}\big[N\,\mathbf{E}[X]\big] = \mathbf{E}[N]\,\mathbf{E}[X] = \frac{\lambda}{2}.$$
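As a numerical sanity check (not part of the original solution), the result $\mathbf{E}[Y] = \lambda/2$ can be verified by simulation. For concreteness we assume each $X_i$ is uniform on $[0,1]$, which is one possible choice consistent with $\mathbf{E}[X] = 1/2$; a minimal Monte Carlo sketch:

```python
import random

def simulate_compound_mean(lam, trials=200_000, seed=0):
    """Estimate E[Y] for Y = X_1 + ... + X_N, where N ~ Poisson(lam)
    and the X_i are i.i.d. uniform on [0, 1] (so E[X] = 1/2)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        # Draw N ~ Poisson(lam) by counting rate-lam exponential
        # interarrival times that fall in [0, 1).
        n, t = 0, rng.expovariate(lam)
        while t < 1.0:
            n += 1
            t += rng.expovariate(lam)
        total += sum(rng.random() for _ in range(n))
    return total / trials

lam = 3.0
estimate = simulate_compound_mean(lam)   # should be close to lam / 2 = 1.5
```

The estimate agrees with $\lambda/2$ up to Monte Carlo noise, matching both the transform derivation in (b) and the iterated-expectations argument in (c).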
Solution to Problem 4.42. Take $X$ and $Y$ to be normal with means 1 and 2, respectively, and very small variances. Consider the random variable that takes the value of $X$ with some probability $p$ and the value of $Y$ with probability $1-p$. This random variable takes values near 1 and 2 with relatively high probability, but takes values near its mean (which is $2-p$) with relatively low probability. Thus, this random variable is not normal.
Now let $N$ be a random variable taking only the values 1 and 2 with probabilities $p$ and $1-p$, respectively. The sum of a number $N$ of independent normal random variables with mean equal to 1 and very small variance is a mixture of the type discussed above, which is not normal.
Solution to Problem 4.43. (a) Using the total probability theorem, we have
$$\mathbf{P}(X > 4) = \sum_{k=0}^{4} \mathbf{P}(k \text{ lights are red})\,\mathbf{P}(X > 4 \mid k \text{ lights are red}).$$
We have
$$\mathbf{P}(k \text{ lights are red}) = \binom{4}{k}\frac{1}{2^4}.$$
The conditional PDF of $X$ given that $k$ lights are red is normal with mean $k$ minutes and standard deviation $(1/2)\sqrt{k}$. Thus, $X$ is a mixture of normal random variables and the transform associated with its (unconditional) PDF is the corresponding mixture of the transforms associated with the (conditional) normal PDFs. However, $X$ is not normal, because a mixture of normal PDFs need not be normal. The probability $\mathbf{P}(X > 4 \mid k \text{ lights are red})$ can be computed from the normal tables for each $k$, and $\mathbf{P}(X > 4)$ is obtained by substituting the results in the total probability formula above.
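In place of normal tables, the same computation can be carried out with the standard normal CDF expressed through the error function (a sketch; for $k = 0$ we take the conditional distribution to be concentrated at 0, since its standard deviation $(1/2)\sqrt{0}$ vanishes):

```python
from math import comb, erf, sqrt

def std_normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def prob_X_gt_4():
    """Total probability: P(X > 4) = sum_k P(k red) * P(X > 4 | k red),
    where X given k red lights is normal with mean k, std (1/2)*sqrt(k)."""
    total = 0.0
    for k in range(5):
        p_k = comb(4, k) / 2**4
        if k == 0:
            tail = 0.0   # zero red lights: X = 0 with certainty
        else:
            tail = 1.0 - std_normal_cdf((4 - k) / (0.5 * sqrt(k)))
        total += p_k * tail
    return total
```

The dominant contributions come from $k = 3$ and $k = 4$ (for $k = 4$ the conditional tail probability is exactly $1/2$, since the conditional mean is 4).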
(b) Let $K$ be the number of traffic lights that are found to be red. We can view $X$ as the sum of $K$ independent normal random variables. Thus the transform associated with $X$ can be found by replacing in the binomial transform $M_K(s) = \big(1/2 + (1/2)e^s\big)^4$ the occurrence of $e^s$ by the normal transform corresponding to $\mu = 1$ and $\sigma = 1/2$. Thus
$$M_X(s) = \left(\frac{1}{2} + \frac{1}{2}\,e^{(1/2)^2 s^2/2 + s}\right)^4.$$
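As a consistency check (not part of the original solution), expanding the fourth power by the binomial theorem must reproduce the mixture-of-transforms form from part (a), namely $\sum_k \binom{4}{k}\frac{1}{2^4}\, e^{k s^2/8 + ks}$, since the transform of a normal with mean $k$ and variance $k/4$ is $e^{ks^2/8 + ks}$:

```python
from math import comb, exp

def M_X_closed(s):
    """Closed form: (1/2 + (1/2) e^{s^2/8 + s})^4, using sigma^2 s^2 / 2 = s^2/8."""
    return (0.5 + 0.5 * exp(s**2 / 8 + s)) ** 4

def M_X_mixture(s):
    """Mixture form: sum over k of P(K = k) times the transform of N(k, k/4)."""
    return sum(comb(4, k) / 16 * exp(k * s**2 / 8 + k * s) for k in range(5))

# The two expressions agree for every s, and M_X(0) = 1 as any transform must.
```

The agreement of the two forms illustrates why the transform is a mixture of normal transforms even though $X$ itself is not normal.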
Note that by using the formula for the transform, we cannot easily obtain the probability $\mathbf{P}(X > 4)$.
Solution to Problem 4.44. (a) Using the random sum formulas, we have
$$\mathbf{E}[N] = \mathbf{E}[M]\,\mathbf{E}[K], \qquad \mathrm{var}(N) = \mathbf{E}[M]\,\mathrm{var}(K) + \big(\mathbf{E}[K]\big)^2\,\mathrm{var}(M).$$
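These random sum formulas can be verified exactly on a small example (hypothetical toy distributions, not from the problem): let $M$ be 1 or 2 with probability $1/2$ each, and each $K_i$ be 0 or 1 with probability $1/2$ each, and enumerate the distribution of $N = K_1 + \cdots + K_M$ directly:

```python
from itertools import product

# Hypothetical toy distributions: M is 1 or 2 w.p. 1/2; each K_i is 0 or 1 w.p. 1/2.
pM = {1: 0.5, 2: 0.5}
pK = {0: 0.5, 1: 0.5}

def moments(dist):
    """Mean and variance of a finite discrete distribution {value: prob}."""
    mean = sum(v * p for v, p in dist.items())
    var = sum((v - mean) ** 2 * p for v, p in dist.items())
    return mean, var

# Exact distribution of N = K_1 + ... + K_M by enumeration.
pN = {}
for m, pm in pM.items():
    for ks in product(pK, repeat=m):
        prob = pm
        for k in ks:
            prob *= pK[k]
        pN[sum(ks)] = pN.get(sum(ks), 0.0) + prob

EM, varM = moments(pM)
EK, varK = moments(pK)
EN, varN = moments(pN)
# The formulas predict EN = EM * EK and varN = EM * varK + EK**2 * varM.
```

Since the enumeration is exact, the agreement holds to machine precision rather than up to sampling error.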
(b) Using the random sum formulas and the results of part (a), we have
$$\mathbf{E}[Y] = \mathbf{E}[N]\,\mathbf{E}[X] = \mathbf{E}[M]\,\mathbf{E}[K]\,\mathbf{E}[X],$$
$$\begin{aligned}
\mathrm{var}(Y) &= \mathbf{E}[N]\,\mathrm{var}(X) + \big(\mathbf{E}[X]\big)^2\,\mathrm{var}(N) \\
&= \mathbf{E}[M]\,\mathbf{E}[K]\,\mathrm{var}(X) + \big(\mathbf{E}[X]\big)^2\Big(\mathbf{E}[M]\,\mathrm{var}(K) + \big(\mathbf{E}[K]\big)^2\,\mathrm{var}(M)\Big).
\end{aligned}$$
(c) Let $N$ denote the total number of widgets in the crate, and let $X_i$ denote the weight of the $i$th widget. The total weight of the crate is
$$Y = X_1 + \cdots + X_N,$$
with
$$N = K_1 + \cdots + K_M,$$
so the framework of part (b) applies. We have
$$\mathbf{E}[M] = \frac{1}{p}, \qquad \mathrm{var}(M) = \frac{1-p}{p^2}, \qquad \text{(geometric formulas)},$$
$$\mathbf{E}[K] = \mu, \qquad \mathrm{var}(K) = \mu, \qquad \text{(Poisson formulas)},$$
$$\mathbf{E}[X] = \frac{1}{\lambda}, \qquad \mathrm{var}(X) = \frac{1}{\lambda^2}, \qquad \text{(exponential formulas)}.$$
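Substituting into part (b) gives $\mathbf{E}[Y] = \mu/(p\lambda)$, which can be checked by simulation. The concrete parameter values below ($p = 1/2$, $\mu = 2$, $\lambda = 1$, giving $\mathbf{E}[Y] = 4$ and $\mathrm{var}(Y) = 16$) are arbitrary choices for the check, not values from the problem:

```python
import random

def simulate_crate(p=0.5, mu=2.0, lam=1.0, trials=200_000, seed=0):
    """Y = total weight: M ~ geometric(p) boxes, each box holds K_j ~ Poisson(mu)
    widgets, each widget weighs X_i ~ exponential(lam)."""
    rng = random.Random(seed)

    def poisson(rate):
        # Count rate-`rate` exponential interarrival times falling in [0, 1).
        n, t = 0, rng.expovariate(rate)
        while t < 1.0:
            n += 1
            t += rng.expovariate(rate)
        return n

    ys = []
    for _ in range(trials):
        m = 1
        while rng.random() >= p:   # geometric: number of trials until first success
            m += 1
        n = sum(poisson(mu) for _ in range(m))
        ys.append(sum(rng.expovariate(lam) for _ in range(n)))
    return ys

ys = simulate_crate()
mean = sum(ys) / len(ys)                           # formula predicts mu/(p*lam) = 4
var = sum((y - mean) ** 2 for y in ys) / len(ys)   # formula predicts 16
```

Both sample moments match the part (b) formulas up to Monte Carlo noise.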