The covariance of two random variables X and Y is defined by
Cov(X, Y) = E[(X − EX)(Y − EY)].
As with the variance, expanding the product shows Cov(X, Y) = E(XY) − (EX)(EY). It follows that if X and Y are independent, then E(XY) = (EX)(EY), and hence Cov(X, Y) = 0.
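As a quick numerical sanity check (not part of the original note; the distributions below are arbitrary choices), we can verify with NumPy that the two expressions for the covariance agree, and that both are near zero for independent samples:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Independent X and Y, so Cov(X, Y) should be (approximately) zero.
X = rng.normal(2.0, 1.0, n)
Y = rng.uniform(0.0, 1.0, n)

cov_def = np.mean((X - X.mean()) * (Y - Y.mean()))  # E[(X - EX)(Y - EY)]
cov_alt = np.mean(X * Y) - X.mean() * Y.mean()      # E(XY) - (EX)(EY)

print(cov_def, cov_alt)  # both close to 0 for independent X, Y
```

The two formulas are algebraically identical (even for the empirical versions), so they agree up to floating-point error; the small deviation from zero is sampling noise.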
Note that
Var(X + Y) = E[((X + Y) − E(X + Y))²]
= E[((X − EX) + (Y − EY))²]
= E[(X − EX)² + 2(X − EX)(Y − EY) + (Y − EY)²]
= Var X + 2 Cov(X, Y) + Var Y.
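The identity just derived holds whether or not X and Y are independent. A short numerical check (not part of the original note; the dependent pair below is an arbitrary construction) illustrates this:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000

# Deliberately dependent X and Y, so Cov(X, Y) != 0.
X = rng.normal(0.0, 1.0, n)
Y = 0.5 * X + rng.normal(0.0, 1.0, n)

var_sum = np.var(X + Y)
# Var X + 2 Cov(X, Y) + Var Y, using the biased (1/n) estimators throughout.
identity = np.var(X) + 2 * np.cov(X, Y, bias=True)[0, 1] + np.var(Y)

print(var_sum, identity)  # the two agree up to floating point
```

Since the same (1/n) normalization is used in every term, the empirical identity holds exactly, not just approximately.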
We have the following corollary.
Proposition 15.2 If X and Y are independent, then
Var (X + Y ) = Var X + Var Y.
Proof. We have
Var(X + Y) = Var X + Var Y + 2 Cov(X, Y) = Var X + Var Y,
since Cov(X, Y) = 0 when X and Y are independent.

Since a binomial is the sum of n independent Bernoullis, its variance is np(1 − p). If we write X̄ = (X_1 + · · · + X_n)/n, where the X_i are independent and have the same distribution (X̄ is called the sample mean), then EX̄ = EX_1 and Var X̄ = Var X_1/n.
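Both facts are easy to check by simulation (not part of the original note; the parameter values below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n, p = 20, 0.3
trials = 200_000

# Binomial(n, p) built as a sum of n independent Bernoulli(p) variables.
bernoullis = rng.random((trials, n)) < p
binom = bernoullis.sum(axis=1)
print(np.var(binom), n * p * (1 - p))  # empirical variance vs np(1-p)

# Sample mean of n i.i.d. standard normals: Var(X̄) = Var(X_1)/n = 1/n.
sample_means = rng.normal(0.0, 1.0, (trials, n)).mean(axis=1)
print(np.var(sample_means), 1.0 / n)
```

The empirical variances match np(1 − p) and Var(X_1)/n up to sampling noise.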
We define the conditional expectation of X given Y by
E[X | Y = y] = ∫ x f_{X|Y=y}(x) dx.
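To make the definition concrete, here is a numerical sketch (not part of the original note; the joint density f(x, y) = x + y on the unit square is a made-up example) that compares the integral to its closed form:

```python
import numpy as np

# Hypothetical example: joint density f(x, y) = x + y on [0, 1]^2.
# The conditional density is f_{X|Y=y}(x) = (x + y) / (y + 1/2), so
# E[X | Y = y] = integral of x * f_{X|Y=y}(x) over [0, 1]
#             = (1/3 + y/2) / (y + 1/2).
y = 0.4
x = np.linspace(0.0, 1.0, 100_001)
f_cond = (x + y) / (y + 0.5)

# Trapezoid-rule approximation of the integral defining E[X | Y = y].
g = x * f_cond
numeric = np.sum((g[:-1] + g[1:]) / 2) * (x[1] - x[0])
closed_form = (1 / 3 + y / 2) / (y + 0.5)

print(numeric, closed_form)
```

The numerical integral matches the closed-form conditional expectation to high precision.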
This note was uploaded on 12/29/2011 for the course MATH 316 taught by Professor Ansan during the Spring '10 term at SUNY Stony Brook.