Chebyshev's inequality (Pitman 3.3)
Theorem (Chebyshev): Let X be a random variable with mean E(X) = μ and variance Var(X) = σ². Then, for any constant k > 0,

    P(|X − μ| ≥ kσ) ≤ 1/k²,

or, equivalently,

    P(|X − μ| < kσ) ≥ 1 − 1/k².
Proof: (Can be skipped) Chebyshev's inequality follows from applying Markov's inequality to the nonnegative random variable (X − μ)², which has mean σ²:

    P(|X − μ| ≥ kσ) = P((X − μ)² ≥ k²σ²) ≤ E[(X − μ)²] / (k²σ²) = σ² / (k²σ²) = 1/k².

Independent random variables
Recall that two events A and B are said to be independent if
    P(A ∩ B) = P(A)P(B).
What does it mean for random variables X and Y to be independent? Intuitively, it means that X contains no information about Y, and vice versa: knowing the value of one does not change the distribution of the other.
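This intuition can be made concrete with a small simulation (an illustration added here, not from the original notes): two dice rolled separately are independent, so for the events {X = 6} and {Y = 6} the product rule P(X = 6, Y = 6) = P(X = 6)P(Y = 6) should hold, at least approximately in the empirical frequencies.

```python
import random

# Roll two fair dice independently many times and compare the joint
# frequency of {X = 6, Y = 6} with the product of the marginal
# frequencies.  Both should be close to 1/36.
random.seed(1)
n = 200_000
rolls = [(random.randint(1, 6), random.randint(1, 6)) for _ in range(n)]

p_x = sum(x == 6 for x, _ in rolls) / n   # P(X = 6), roughly 1/6
p_y = sum(y == 6 for _, y in rolls) / n   # P(Y = 6), roughly 1/6
p_xy = sum(x == 6 and y == 6 for x, y in rolls) / n  # joint frequency

print(f"P(X=6, Y=6) = {p_xy:.4f}, P(X=6)P(Y=6) = {p_x * p_y:.4f}")
```

If the two coordinates were instead generated from the same roll, the joint frequency would be about 1/6 rather than 1/36, and the product rule would visibly fail.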

Joint distributions (Pitman 3.1 and 5.2)
So far we have considered a single random variable by itself, but in many situations we need to know
the relationship between several random variables. For example, if we are interested in studying
obesity in chi