Math136/Stat219 Fall 2008
Midterm Examination
Friday, October 24, 2008, 11:00am–12:30pm
Write your name and sign the Honor code in the blue books provided.
You have 90 minutes to solve all questions, each worth points as marked (maximum of 50).
Complete reasoning is required for full credit.
You may cite lecture notes and homework
sets, as needed, stating precisely the result you use, why and how it applies.
You may consult the following materials while taking the exam:
1. Stat219/Math136 Lecture notes, Fall 2008 version (the required text)
2. Kevin Ross’s Lecture slides posted in Coursework (Fall 2008 only)
3. Homework problems, sample exams, and solutions posted in Coursework (Fall 2008 only)
4. Your own graded homework papers
5. Your own notes, handwritten or typed
Use of any other material is prohibited and constitutes a violation of the Honor
Code.
This includes, but is not limited to: other texts (including optional and recommended
texts), photocopying of texts or notes, materials from previous sections of Stat219/Math136,
the internet, programming formulas or other results in a calculator or computer, consultation
with anyone during the exam (except for the Teaching Assistants or the Instructor).
1. (3 points each) On a probability space $(\Omega, \mathcal{F}, P)$, let $Y$ be a random variable with $E(Y^2) < \infty$ and let $\mathcal{G} \subseteq \mathcal{F}$ be a $\sigma$-field. Define
$$\mathrm{Var}(Y \mid \mathcal{G}) = E(Y^2 \mid \mathcal{G}) - \big(E(Y \mid \mathcal{G})\big)^2.$$
Show the following. (Note: you must give a proof. Merely citing Exercise 2.3.7 will receive no credit.)
a) Show that if $Y$ is $\mathcal{G}$-measurable then $\mathrm{Var}(Y \mid \mathcal{G}) = 0$ almost surely.
ANS. Since $Y$ is $\mathcal{G}$-measurable, so is $Y^2$, and hence $E(Y \mid \mathcal{G}) = Y$ and $E(Y^2 \mid \mathcal{G}) = Y^2$ almost surely. Therefore
$$\mathrm{Var}(Y \mid \mathcal{G}) = E(Y^2 \mid \mathcal{G}) - \big(E(Y \mid \mathcal{G})\big)^2 = Y^2 - Y^2 = 0 \quad \text{a.s.}$$
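As a quick sanity check of this fact, here is a minimal finite example (my own illustration, not part of the exam): when $\mathcal{G}$ is generated by a partition and $Y$ is constant on each block, the conditional variance vanishes on every block.

```python
from fractions import Fraction

# Hypothetical finite sanity check (not part of the exam): on Omega = {0,1,2,3}
# with uniform P, let G be the sigma-field generated by the partition
# {0,1} | {2,3}. A G-measurable Y is constant on each block, so on every block
# E(Y|G) = Y and E(Y^2|G) = Y^2, forcing Var(Y|G) = 0.
blocks = [[0, 1], [2, 3]]
Y = {0: 5, 1: 5, 2: -2, 3: -2}           # constant on blocks => G-measurable

cond_vars = []
for block in blocks:
    p = Fraction(1, len(block))          # conditional probability within block
    ey = sum(p * Y[w] for w in block)        # E(Y | G) on this block
    ey2 = sum(p * Y[w] ** 2 for w in block)  # E(Y^2 | G) on this block
    cond_vars.append(ey2 - ey ** 2)
assert cond_vars == [0, 0]               # Var(Y|G) vanishes on every block
```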
b) Show that $\mathrm{Var}(Y) = E(\mathrm{Var}(Y \mid \mathcal{G})) + \mathrm{Var}(E(Y \mid \mathcal{G}))$.
ANS. Using the tower property and linearity of conditional expectation,
$$E(\mathrm{Var}(Y \mid \mathcal{G})) = E\big[E(Y^2 \mid \mathcal{G}) - (E(Y \mid \mathcal{G}))^2\big] = E(Y^2) - E\big[(E(Y \mid \mathcal{G}))^2\big].$$
The definition of (unconditional) variance and the tower property imply
$$\mathrm{Var}(E(Y \mid \mathcal{G})) = E\big[(E(Y \mid \mathcal{G}))^2\big] - \big(E[E(Y \mid \mathcal{G})]\big)^2 = E\big[(E(Y \mid \mathcal{G}))^2\big] - (E[Y])^2.$$
Adding the above two equations yields
$$E(\mathrm{Var}(Y \mid \mathcal{G})) + \mathrm{Var}(E(Y \mid \mathcal{G})) = E(Y^2) - (E(Y))^2 = \mathrm{Var}(Y).$$
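This decomposition (the law of total variance) can be verified exactly on a small hypothetical example of my own, not taken from the exam: $Y$ is a fair die roll and $\mathcal{G} = \sigma(\text{parity})$.

```python
from fractions import Fraction

# Hypothetical worked example (not from the exam): Y is a fair die roll and
# G = sigma(parity). We verify Var(Y) = E(Var(Y|G)) + Var(E(Y|G)) exactly.
faces = [1, 2, 3, 4, 5, 6]
blocks = [[1, 3, 5], [2, 4, 6]]           # the partition generating G

EY = Fraction(sum(faces), 6)
VarY = Fraction(sum(f * f for f in faces), 6) - EY ** 2

e_condvar = Fraction(0)                   # E(Var(Y|G))
e_ce = Fraction(0)                        # E[E(Y|G)]  (= EY by tower property)
e_ce2 = Fraction(0)                       # E[(E(Y|G))^2]
for block in blocks:
    pb = Fraction(len(block), 6)          # P(block)
    ce = Fraction(sum(block), len(block))                  # E(Y|G) on block
    ce2 = Fraction(sum(f * f for f in block), len(block))  # E(Y^2|G) on block
    e_condvar += pb * (ce2 - ce ** 2)
    e_ce += pb * ce
    e_ce2 += pb * ce ** 2

var_ce = e_ce2 - e_ce ** 2                # Var(E(Y|G))
assert e_ce == EY                         # tower property
assert VarY == e_condvar + var_ce         # law of total variance
```

Here $E(\mathrm{Var}(Y \mid \mathcal{G})) = 8/3$ and $\mathrm{Var}(E(Y \mid \mathcal{G})) = 1/4$, which indeed sum to $\mathrm{Var}(Y) = 35/12$.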
c) Suppose that $Y$ is $\mathcal{G}$-measurable and $X$ is a random variable on $(\Omega, \mathcal{F}, P)$ with $E(X^2) < \infty$. Show that
$$\mathrm{Var}(XY \mid \mathcal{G}) = Y^2\,\mathrm{Var}(X \mid \mathcal{G}).$$
ANS. Since $Y$ is $\mathcal{G}$-measurable, taking out what is known yields
$$\mathrm{Var}(XY \mid \mathcal{G}) = E\big((XY)^2 \mid \mathcal{G}\big) - \big(E(XY \mid \mathcal{G})\big)^2 = Y^2\,E(X^2 \mid \mathcal{G}) - \big(Y\,E(X \mid \mathcal{G})\big)^2 = Y^2\,\mathrm{Var}(X \mid \mathcal{G}).$$
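The identity can likewise be checked exactly on a finite hypothetical example (again my own illustration, not the exam's): $\mathcal{G}$ generated by a partition, $Y$ constant on each block, $X$ arbitrary.

```python
from fractions import Fraction

# Hypothetical finite sanity check (not from the exam): Omega = {0,1,2,3} with
# uniform P, G generated by the partition {0,1} | {2,3}. Y is constant on each
# block (hence G-measurable); X is arbitrary.
blocks = [[0, 1], [2, 3]]
Y = {0: 3, 1: 3, 2: -1, 3: -1}
X = {0: 1, 1: 4, 2: 2, 3: 7}

def cond_var(Z, block):
    """Var(Z | G) on a partition block, under uniform conditional probability."""
    p = Fraction(1, len(block))
    ez = sum(p * Z[w] for w in block)        # E(Z | G) on the block
    ez2 = sum(p * Z[w] ** 2 for w in block)  # E(Z^2 | G) on the block
    return ez2 - ez ** 2

XY = {w: X[w] * Y[w] for w in X}
for block in blocks:
    y = Y[block[0]]                       # Y's constant value on the block
    assert cond_var(XY, block) == y ** 2 * cond_var(X, block)
```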
2. (3 points each) Consider the probability space $(\Omega, \mathcal{F}, P)$, where $\Omega = (0,1)$; $\mathcal{F}$ is the Borel $\sigma$-field on $(0,1)$, that is, $\mathcal{F} = \sigma(\{(a,b) : 0 < a < b < 1\})$; and $P$ is the uniform probability measure. For $n = 1, 2, \ldots$ define
$$X_n(\omega) = 2^n\, I_{A_n}(\omega), \quad \omega \in \Omega, \qquad \text{where } A_n = \left(\frac{1}{2} - \frac{1}{2^n},\; \frac{1}{2} + \frac{1}{2^n}\right).$$
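A quick numeric sketch of this setup (my own illustration, assuming the reconstructed intervals $A_n = (1/2 - 2^{-n},\, 1/2 + 2^{-n})$): $P(A_n) = 2^{1-n}$, so $E(X_n) = 2^n \cdot P(A_n) = 2$ for every $n$, while for any fixed $\omega \neq 1/2$ the point $\omega$ eventually falls outside $A_n$, so $X_n(\omega) \to 0$ pointwise.

```python
from fractions import Fraction

# Illustration only, assuming A_n = (1/2 - 2^{-n}, 1/2 + 2^{-n}):
# E(X_n) = 2^n * P(A_n) = 2 for all n, yet X_n(omega) -> 0 for omega != 1/2.
def X(n, omega):
    """X_n(omega) = 2^n * I_{A_n}(omega)."""
    half, r = Fraction(1, 2), Fraction(1, 2) ** n
    return 2 ** n if half - r < omega < half + r else 0

expectations = [2 ** n * Fraction(2, 2 ** n) for n in range(1, 20)]  # 2^n P(A_n)
assert expectations == [2] * 19           # E(X_n) = 2 for every n

omega = Fraction(2, 3)                    # a fixed sample point other than 1/2
assert [X(n, omega) for n in range(1, 8)] == [2, 4, 0, 0, 0, 0, 0]
```

The constant expectation alongside pointwise convergence to $0$ is why the dominated convergence theorem (one of this problem's listed topics) cannot apply here: no integrable dominating function exists.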