IEOR 4106, Spring 2011, Professor Whitt
Martingales, Gambling and Brownian Motion
April 26, 2011
1. Martingales
We start by defining a martingale, working in discrete time. This first section is an alternative supplement to the notes from the last class.
Definition 1.1 Let {X_n : n ≥ 0} and {Y_n : n ≥ 0} be stochastic processes (sequences of random variables). We say that {X_n : n ≥ 0} is a martingale with respect to {Y_n : n ≥ 0} if

(i) E[|X_n|] < ∞ for all n ≥ 0, and

(ii) E[X_{n+1} | Y_0, Y_1, . . . , Y_n] = X_n for all n ≥ 0.
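As a concrete illustration (an example assumed here, not spelled out in the notes above), the symmetric random walk X_n = Y_1 + · · · + Y_n, with i.i.d. steps Y_i equal to +1 or −1 with probability 1/2 each and X_0 = 0, is a martingale with respect to {Y_n : n ≥ 0}. The following sketch verifies condition (ii) exactly by enumerating every possible history:

```python
from itertools import product
from fractions import Fraction

# Symmetric random walk X_n = Y_1 + ... + Y_n with i.i.d. steps
# Y_i in {-1, +1}, each with probability 1/2.  We check condition (ii),
# E[X_{n+1} | Y_1, ..., Y_n] = X_n, for every history of length n.
half = Fraction(1, 2)
n = 4
checks = []
for history in product([-1, 1], repeat=n):   # all values of (Y_1, ..., Y_n)
    x_n = sum(history)                       # X_n given this history
    # Y_{n+1} is independent of the past, so conditionally on the history
    # E[X_{n+1} | Y_1, ..., Y_n] = (1/2)(x_n + 1) + (1/2)(x_n - 1) = x_n.
    cond_exp = half * (x_n + 1) + half * (x_n - 1)
    checks.append(cond_exp == x_n)

all_histories_ok = all(checks)               # martingale property (ii) holds
```

Exact rational arithmetic (Fraction) is used so the check is a true equality, not a floating-point approximation.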
a. More on Definition 1.1.

In Definition 1.1 we think of the stochastic process {Y_n : n ≥ 0} as constituting the history or information. Then {Y_k : 0 ≤ k ≤ n} is the history up to (and including) time n. The random variables Y_k could be random vectors, as we illustrate below.
We simply say that {X_n : n ≥ 0} is a martingale if {X_n : n ≥ 0} is a martingale with respect to {X_n : n ≥ 0}; i.e., if the history process {Y_n : n ≥ 0} is the stochastic process {X_n : n ≥ 0} itself. We then also say that {X_n : n ≥ 0} is a martingale with respect to its internal history (the history generated by {X_n : n ≥ 0}).
In the literature on martingales, the histories are usually characterized via sigma-fields of events, denoted by F_n for n ≥ 0. We know whether or not each of the events in F_n occurred by time n. We then write, instead of (ii) above,

(ii) E[X_{n+1} | F_n] = X_n for all n ≥ 0,

where F_n is understood to be the history up to time n. With that notation, we assume the history is cumulative, starting at time 0. Then F_n can be understood to be shorthand for {Y_k : 0 ≤ k ≤ n}.
b. Conditional Expectation

In order to understand the definitions above, we need to understand conditional expectation. The basic concepts are reviewed in the first four sections of Chapter 3 in Ross. In particular, we need to know what E[X | Y] means for random variables or random vectors X and Y. For this, see p. 106 of Ross.
By E[X | Y], we mean a random variable. In particular, E[X | Y] = E[X | Y = y] when Y = y. Thus E[X | Y] can be regarded as a deterministic function of the random variable Y, which makes it itself a random variable. Since (in the discrete case)

E[X] = Σ_y E[X | Y = y] P(Y = y) = E[E[X | Y]],
we have the fundamental relation

E[E[X | Y]] = E[X]

for all random variables X and Y.
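This fundamental relation can be checked exactly on a small hypothetical discrete joint distribution (the particular probabilities below are chosen only for illustration and do not come from the notes):

```python
from fractions import Fraction

# A small hypothetical joint pmf p(x, y), in exact rational arithmetic.
p = {
    (0, 0): Fraction(1, 8), (1, 0): Fraction(3, 8),
    (0, 1): Fraction(1, 4), (2, 1): Fraction(1, 4),
}

# Direct expectation E[X]:
e_x = sum(x * prob for (x, _), prob in p.items())

# Tower computation: for each y, compute E[X | Y = y] and P(Y = y),
# then form the sum over y of E[X | Y = y] P(Y = y) = E[E[X | Y]].
e_tower = Fraction(0)
for y in {yy for (_, yy) in p}:
    p_y = sum(prob for (_, yy), prob in p.items() if yy == y)          # P(Y = y)
    e_x_given_y = sum(x * prob for (x, yy), prob in p.items() if yy == y) / p_y
    e_tower += e_x_given_y * p_y

# The fundamental relation E[E[X | Y]] = E[X] holds exactly.
```

Here E[X] = 0·(1/8) + 1·(3/8) + 0·(1/4) + 2·(1/4) = 7/8, and the tower computation recovers the same value: E[X | Y = 0] = 3/4 with P(Y = 0) = 1/2, and E[X | Y = 1] = 1 with P(Y = 1) = 1/2, giving (3/4)(1/2) + (1)(1/2) = 7/8.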
As a consequence, for a martingale {X_n : n ≥ 0} with respect to {Y_n : n ≥ 0}, we have

E[X_{n+1}] = E[E[X_{n+1} | Y_0, Y_1, . . . , Y_n]] = E[X_n] for all n ≥ 0.
Thus, by mathematical induction, for a martingale E[X_n] = E[X_0] for all n ≥ 1. This last expected-value relation is a consequence of the martingale property, but it is not equivalent to it; the martingale property implies more than that.
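For the random-walk martingale used as an example earlier (an assumed illustration, not from the notes), the constant-expectation consequence E[X_n] = E[X_0] = 0 can be verified exactly by enumerating all 2^n equally likely step sequences:

```python
from itertools import product
from fractions import Fraction

# Exact check of E[X_n] = E[X_0] = 0 for the fair-coin random walk
# X_n = Y_1 + ... + Y_n, X_0 = 0: each of the 2^n step sequences in
# {-1, +1}^n has probability (1/2)^n, so E[X_n] is a finite exact sum.
def expected_position(n):
    prob = Fraction(1, 2) ** n
    return sum(Fraction(sum(path)) * prob
               for path in product([-1, 1], repeat=n))

e_vals = [expected_position(n) for n in range(6)]
# Every term is 0, matching E[X_n] = E[X_0] for all n.
```

Note that this only tests the weaker expected-value consequence; as the text says, the martingale property itself is a statement about conditional expectations given the history, which is strictly stronger.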
c. More on the Martingale Definition.

We can now say more about Definition 1.1. First, we observe that property (i) is a technical regularity condition, while property (ii) is the key property. It could be expressed equivalently in two parts:

(ii-a) E[X_n | Y_0, Y_1, . . . , Y_n] = X_n for all n ≥ 0.