Conditional expectation
Jason Swanson
April 17, 2009
1  Conditioning on σ-algebras
Let (Ω, F, P) be a probability space and let A ∈ F with P(A) > 0. Define

    Q(B) = P(B | A) = P(B ∩ A) / P(A),   for all B ∈ F.
It is easy to check that Q is a probability measure on (Ω, F). If X is a random variable, we define the conditional expectation of X given A as

    E[X | A] = ∫ X dQ,                                        (1.1)

whenever this integral is well-defined. Note that E[1_B | A] = P(B | A).
Theorem 1.1. If E[|X| 1_A] < ∞, then X is Q-integrable. If X ≥ 0 or E[|X| 1_A] < ∞, then

    E[X | A] = E[X 1_A] / P(A).                               (1.2)
Remark 1.2. Note that (1.2) may be written as

    E[X | A] = α(A) / P(A),                                   (1.3)

where dα = X dP. Also note that (1.2) gives us the formula E[X 1_A] = P(A) E[X | A]. If X = 1_B, then this reduces to the familiar multiplication rule, P(A ∩ B) = P(A) P(B | A).
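As a concrete check of (1.2) and the multiplication rule, the following sketch works through a fair die; the die, the events A and B, and all helper names are illustrative choices, not from the notes:

```python
from fractions import Fraction

# Fair die: Ω = {1,...,6} with uniform P.  (Illustrative setup.)
omega = set(range(1, 7))
P = {w: Fraction(1, 6) for w in omega}

def prob(event):
    """P(event), for an event given as a set of outcomes."""
    return sum(P[w] for w in event)

def expect(X, event=None):
    """E[X 1_event]; with event=None this is plain E[X]."""
    return sum(X(w) * P[w] for w in (omega if event is None else event))

def cond_expect(X, A):
    """E[X | A] = E[X 1_A] / P(A), formula (1.2)."""
    return expect(X, A) / prob(A)

A = {2, 4, 6}              # "the roll is even"
B = {4, 5, 6}              # "the roll is at least 4"
X = lambda w: w            # the face value

print(cond_expect(X, A))   # (2+4+6)/6 divided by 1/2, i.e. 4
# Multiplication rule: P(A ∩ B) = P(A) P(B | A), with 1_B as the indicator.
indB = lambda w: 1 if w in B else 0
print(prob(A & B) == prob(A) * cond_expect(indB, A))   # True
```

Exact `Fraction` arithmetic avoids any floating-point caveats in the comparison.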
Proof of Theorem 1.1. Note that if P(B) = 0, then Q(B) = 0. Hence Q ≪ P. Also note that

    Q(B) = ∫_B (1_A / P(A)) dP,   for all B ∈ F.

Thus, dQ/dP = 1_A / P(A). It follows that if X ≥ 0, then

    E[X | A] = ∫ X dQ = ∫ X (dQ/dP) dP = E[X 1_A / P(A)] = E[X 1_A] / P(A).

Therefore, if E[|X| 1_A] < ∞, then X is Q-integrable, and the same formula holds. ∎
Lemma 1.3. Any finite σ-algebra F on Ω can be written as F = σ({A_j}_{j=1}^n), where {A_j}_{j=1}^n is a partition of Ω, that is, Ω = ⊔_{j=1}^n A_j. Moreover, the partition {A_j}_{j=1}^n is unique.
Proof. For each ω ∈ Ω, let A_ω be the smallest measurable set containing ω. That is, A_ω = ⋂ F_ω, where F_ω = {A ∈ F : ω ∈ A}. Since this is a finite intersection, A_ω ∈ F. In particular, E = {A_ω : ω ∈ Ω} is a finite set. We claim that E is a partition of Ω and that F = σ(E).
Clearly, Ω = ⋃_{ω ∈ Ω} A_ω, so to show that E is a partition, it suffices to show that this is a disjoint union. More specifically, we wish to show that if ω, ω′ ∈ Ω, then either A_ω = A_{ω′} or A_ω ∩ A_{ω′} = ∅. Let ω, ω′ ∈ Ω. Note that for any A ∈ F, if ω ∈ A, then A ∈ F_ω, which implies A_ω ⊂ A. Hence, if ω ∈ A_{ω′}, then A_ω ⊂ A_{ω′}; and if ω ∈ A_{ω′}^c, then A_ω ⊂ A_{ω′}^c. That is, either A_ω ⊂ A_{ω′} or A_ω ⊂ A_{ω′}^c. By symmetry, either A_{ω′} ⊂ A_ω or A_{ω′} ⊂ A_ω^c. Taken together, this shows that either A_ω = A_{ω′} or A_ω ∩ A_{ω′} = ∅.
To see that F = σ(E), simply note that any A ∈ F can be written as A = ⋃_{ω ∈ A} A_ω, and that this is a finite union.
For uniqueness, suppose that F = σ({B_j}_{j=1}^n), where Ω = ⊔_{j=1}^n B_j. If ω ∈ B_j, then A_ω = B_j. Therefore, E = {B_j}_{j=1}^n. ∎
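On a finite Ω the proof is constructive: the atom A_ω is the intersection of every generator containing ω with the complement of every generator not containing ω. A minimal sketch, where the function name and the sample generators are my own, not from the notes:

```python
def atoms(omega, generators):
    """Compute {A_ω : ω ∈ Ω} for σ(generators) on a finite Ω.
    A_ω is the smallest measurable set containing ω: intersect each
    generator G with ω ∈ G, and the complement of each G with ω ∉ G."""
    omega = frozenset(omega)
    out = set()
    for w in omega:
        a = omega
        for G in map(frozenset, generators):
            a = a & G if w in G else a - G
        out.add(frozenset(a))
    return out

# Example: two generating sets on Ω = {1,...,6}.
parts = atoms(range(1, 7), [{1, 2, 3}, {3, 4}])
print(sorted(sorted(a) for a in parts))   # [[1, 2], [3], [4], [5, 6]]
```

The four atoms partition Ω, and σ({1,2,3}, {3,4}) consists of all 2⁴ = 16 unions of them, in line with the lemma.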
Exercise 1.4. Show that every infinite σ-algebra is uncountable.
Let (Ω, F, P) be a probability space and X an integrable random variable. Let G ⊂ F be a finite σ-algebra. Write G = σ({A_j}_{j=1}^n), where {A_j}_{j=1}^n is a partition of Ω.
The conditional expectation of X given G, written E[X | G], is a random variable defined by

    E[X | G](ω) = E[X | A_1]   if ω ∈ A_1,
                  E[X | A_2]   if ω ∈ A_2,
                      ⋮
                  E[X | A_n]   if ω ∈ A_n.
Note that we may write

    E[X | G] = Σ_{j=1}^n E[X | A_j] 1_{A_j}.                  (1.4)
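Formula (1.4) translates directly into code: on each cell of the partition, E[X | G] takes the constant value E[X | A_j] given by (1.2). A sketch on a six-point space, where the function name, the partition, and X are illustrative choices:

```python
from fractions import Fraction

def cond_expect_given_partition(X, partition, P):
    """E[X | G] as a dict ω -> value, via (1.4):
    E[X | G] = Σ_j E[X | A_j] 1_{A_j}, with E[X | A_j] = E[X 1_{A_j}]/P(A_j)."""
    out = {}
    for A in partition:
        pA = sum(P[w] for w in A)                  # P(A_j)
        eXA = sum(X(w) * P[w] for w in A) / pA     # E[X | A_j], by (1.2)
        for w in A:                                # constant on the cell A_j
            out[w] = eXA
    return out

P = {w: Fraction(1, 6) for w in range(1, 7)}       # fair die
partition = [{1, 2}, {3, 4}, {5, 6}]               # generates a finite G
X = lambda w: w * w
Y = cond_expect_given_partition(X, partition, P)
print(Y[1], Y[3], Y[6])   # 5/2 25/2 61/2
```

Each value is just the average of X over the cell the outcome falls in, e.g. E[X | {1,2}] = (1 + 4)/2 = 5/2.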
We also define the conditional probability of A given G as P(A | G) = E[1_A | G]. Note that P(A | G)(ω) = P(A | A_j) if ω ∈ A_j.
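Taking X = 1_A in the previous construction specializes it to conditional probability: P(A | G) is constant equal to P(A | A_j) = P(A ∩ A_j)/P(A_j) on each cell. Again the event and partition below are illustrative choices, not from the notes:

```python
from fractions import Fraction

def cond_prob_given_partition(A, partition, P):
    """P(A | G) as a dict ω -> value: P(A|G)(ω) = P(A ∩ A_j)/P(A_j) for ω ∈ A_j."""
    out = {}
    for Aj in partition:
        pAj = sum(P[w] for w in Aj)            # P(A_j)
        pAAj = sum(P[w] for w in A & Aj)       # P(A ∩ A_j)
        for w in Aj:
            out[w] = pAAj / pAj                # P(A | A_j)
    return out

P = {w: Fraction(1, 6) for w in range(1, 7)}   # fair die
partition = [{1, 2, 3}, {4, 5, 6}]
A = {2, 3, 4}
Z = cond_prob_given_partition(A, partition, P)
print(Z[1], Z[4])   # 2/3 1/3
```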