CS 70
Discrete Mathematics and Probability Theory
Fall 2010
Tse/Wagner
Lecture 16
Variance
Question: At each time step, I flip a fair coin. If it comes up Heads, I walk one step to the right; if it comes up Tails, I walk one step to the left. How far do I expect to have traveled from my starting point after $n$ steps?
Denoting a right-move by $+1$ and a left-move by $-1$, we can describe the probability space here as the set of all words of length $n$ over the alphabet $\{\pm 1\}$, each having equal probability $\frac{1}{2^n}$. For instance, one possible outcome is $(+1,+1,-1,\ldots,-1)$. Let the r.v. $X$ denote our position (relative to our starting point 0) after $n$ moves. Thus
$$X = X_1 + X_2 + \cdots + X_n, \qquad \text{where } X_i = \begin{cases} +1 & \text{if $i$th toss is Heads;} \\ -1 & \text{otherwise.} \end{cases}$$
Now obviously we have $E(X) = 0$. The easiest way to see this is to note that $E(X_i) = (\frac{1}{2} \times 1) + (\frac{1}{2} \times (-1)) = 0$, so by linearity of expectation $E(X) = \sum_{i=1}^{n} E(X_i) = 0$. Thus after $n$ steps, my expected position is 0. But of course this is not very informative, and is due to the fact that positive and negative deviations from 0 cancel out.
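As a quick sanity check (not part of the original notes), we can confirm $E(X) = 0$ exactly by enumerating all $2^n$ equally likely walks for a small $n$; the function name `expected_position` is mine, chosen for illustration.

```python
from itertools import product

def expected_position(n):
    """Exact E(X): enumerate all 2^n walks of length n, each with probability 1/2^n."""
    total = 0
    for walk in product((+1, -1), repeat=n):
        total += sum(walk)  # final position of this walk
    return total / 2**n

# Each walk cancels against its mirror image, so the expectation is exactly 0.
print(expected_position(10))  # → 0.0
```

The pairing of each walk with its sign-flipped mirror is the enumeration analogue of the cancellation argument above.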
What the above question is really asking is: what is the expected value of $|X|$, our distance from 0? Unfortunately, computing the expected value of $|X|$ turns out to be a little awkward, due to the absolute value operator. Therefore, rather than consider the r.v. $|X|$, we will instead look at the r.v. $X^2$. Notice that this also has the effect of making all deviations from 0 positive, so it should also give a good measure of the distance traveled. However, because it is the squared distance, we will need to take a square root at the end.
Let's calculate $E(X^2)$:
$$\begin{aligned}
E(X^2) &= E((X_1 + X_2 + \cdots + X_n)^2) \\
&= E\Bigl(\sum_{i=1}^{n} X_i^2 + \sum_{i \neq j} X_i X_j\Bigr) \\
&= \sum_{i=1}^{n} E(X_i^2) + \sum_{i \neq j} E(X_i X_j)
\end{aligned}$$
In the last line here, we used linearity of expectation. To proceed, we need to compute $E(X_i^2)$ and $E(X_i X_j)$ (for $i \neq j$). Let's consider first $X_i^2$. Since $X_i$ can take on only the values $\pm 1$, clearly $X_i^2 = 1$ always, so $E(X_i^2) = 1$.
What about $E(X_i X_j)$? Well, $X_i X_j = +1$ when $X_i = X_j = +1$ or $X_i = X_j = -1$, and otherwise $X_i X_j = -1$. Also,
$$\Pr[(X_i = X_j = +1) \vee (X_i = X_j = -1)] = \Pr[X_i = X_j = +1] + \Pr[X_i = X_j = -1] = \tfrac{1}{4} + \tfrac{1}{4} = \tfrac{1}{2},$$
so $X_i X_j = 1$ with probability $\frac{1}{2}$. Therefore $X_i X_j = -1$ with probability $\frac{1}{2}$ also. Hence $E(X_i X_j) = 0$.
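Again as an illustrative check (not part of the notes, and with function names of my own choosing), exact enumeration over all $2^n$ walks confirms that $E(X_i X_j) = 0$ for $i \neq j$, and that substituting $E(X_i^2) = 1$ and $E(X_i X_j) = 0$ into the sum above makes $E(X^2)$ come out to exactly $n$.

```python
from itertools import product

def expected_products(n, i, j):
    """Exact E(X_i X_j) and E(X^2) for a walk of length n, by enumerating all 2^n outcomes."""
    cross = 0
    square = 0
    for walk in product((+1, -1), repeat=n):
        cross += walk[i] * walk[j]   # contribution to E(X_i X_j)
        square += sum(walk) ** 2     # contribution to E(X^2)
    return cross / 2**n, square / 2**n

e_cross, e_square = expected_products(8, 0, 3)
print(e_cross)   # → 0.0, matching E(X_i X_j) = 0
print(e_square)  # → 8.0, i.e. n * 1 + 0 = n
```

Flipping the sign of coordinate $i$ pairs each outcome with one of opposite product, which is why the cross term vanishes exactly rather than just approximately.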