Joint distributions (Pitman 3.1 and 5.2)
So far we have considered a single random variable by itself, but in many situations we need to know
the relationship between several random variables. For example, if we are interested in studying
obesity in children, we might need to consider the joint distribution of a child's height and weight.
Independent random variables
Recall that two events A and B are said to be independent if
P(A ∩ B) = P(A)P(B).
What does it mean for random variables X and Y to be independent? Intuitively, it means that X
contains no information about Y, and vice versa.
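As a concrete illustration of the product rule for independent random variables, here is a small sketch (not from the text) using two fair dice. The distributions `px` and `py` and the dictionary-based joint table are illustrative choices; the point is that for independent X and Y, every joint probability factors as P(X = x)P(Y = y).

```python
from fractions import Fraction

# Two fair dice rolled independently: X and Y each uniform on {1, ..., 6}.
px = {x: Fraction(1, 6) for x in range(1, 7)}  # distribution of X
py = {y: Fraction(1, 6) for y in range(1, 7)}  # distribution of Y

# Under independence, the joint distribution is the product of the marginals.
joint = {(x, y): px[x] * py[y] for x in px for y in py}

# Sanity checks: every entry factors, and the whole table sums to 1.
assert all(joint[(x, y)] == px[x] * py[y] for x in px for y in py)
assert sum(joint.values()) == 1
print(joint[(3, 4)])  # each pair has probability 1/36
```

Using exact `Fraction` arithmetic avoids floating-point noise when checking that the table sums to exactly 1.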
Chebyshev's inequality (Pitman 3.3)
Theorem (Chebyshev): Let X be a random variable with mean E(X) = μ and variance Var(X) = σ².
Then, for any constant k > 0,

P(|X − μ| ≥ kσ) ≤ 1/k²,

or equivalently,

P(|X − μ| < kσ) ≥ 1 − 1/k².
Proof: (Can be skipped) Chebyshev's inequal