elemprob-fall2010-page40
The covariance of two random variables X and Y is defined by

Cov(X, Y) = E[(X − E X)(Y − E Y)].

As with the variance, this expands to Cov(X, Y) = E(XY) − (E X)(E Y). If X and Y are independent, then E(XY) = (E X)(E Y), and hence Cov(X, Y) = 0.

Note that

Var(X + Y) = E[((X + Y) − E(X + Y))²]
           = E[((X − E X) + (Y − E Y))²]
           = E[(X − E X)² + 2(X − E X)(Y − E Y) + (Y − E Y)²]
           = Var X + 2 Cov(X, Y) + Var Y.

We have the following corollary.

Proposition 15.2. If X and Y are independent, then Var(X + Y) = Var X + Var Y.

Proof. Since Cov(X, Y) = 0 by independence, we have

Var(X + Y) = Var X + Var Y + 2 Cov(X, Y) = Var X + Var Y.

Since a binomial is the sum of n independent Bernoullis, its variance is np(1 − p).

If we write X̄ = ∑_{i=1}^n X_i / n, where the X_i are independent and have the same distribution (X̄ is called the sample mean), then E X̄ = E X₁ and Var X̄ = Var X₁ / n.

We define the conditional expectation of X given Y by

E[X | Y = y] = ∫ x f_{X|Y=y}(x) dx.
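The covariance identities above can be checked numerically. The sketch below (the joint pmf is a made-up example, not from the notes) verifies the shortcut formula Cov(X, Y) = E(XY) − (E X)(E Y) and the identity Var(X + Y) = Var X + 2 Cov(X, Y) + Var Y on a small finite joint distribution:

```python
# Hypothetical joint pmf on {0,1} x {0,1,2}, chosen only for illustration.
pmf = {(0, 0): 0.1, (0, 1): 0.2, (0, 2): 0.1,
       (1, 0): 0.2, (1, 1): 0.1, (1, 2): 0.3}

def E(f):
    """Expectation of f(x, y) under the joint pmf."""
    return sum(p * f(x, y) for (x, y), p in pmf.items())

EX, EY = E(lambda x, y: x), E(lambda x, y: y)
cov = E(lambda x, y: (x - EX) * (y - EY))      # definition of Cov(X, Y)
cov_alt = E(lambda x, y: x * y) - EX * EY      # shortcut: E(XY) - EX*EY
var_x = E(lambda x, y: (x - EX) ** 2)
var_y = E(lambda x, y: (y - EY) ** 2)
var_sum = E(lambda x, y: (x + y - EX - EY) ** 2)

assert abs(cov - cov_alt) < 1e-12                          # shortcut formula
assert abs(var_sum - (var_x + 2 * cov + var_y)) < 1e-12    # Var(X+Y) identity
```

Both identities hold exactly (up to floating-point rounding) for any joint pmf, not just this one.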
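The binomial and sample-mean variance facts can also be checked exactly. A minimal sketch, assuming Bernoulli(p) components with illustrative values n = 10 and p = 0.3 (not from the notes): a Binomial(n, p) is the sum of n independent Bernoulli(p)'s, so by Proposition 15.2 its variance is np(1 − p), and the sample mean X̄ = S/n then has Var X̄ = Var S / n² = p(1 − p)/n = Var X₁ / n.

```python
from math import comb

n, p = 10, 0.3  # illustrative parameters

# Exact Binomial(n, p) pmf: P(S = k) = C(n, k) p^k (1-p)^(n-k).
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

mean = sum(k * q for k, q in pmf.items())
var_binom = sum((k - mean) ** 2 * q for k, q in pmf.items())

# Variance of a binomial is n*p*(1-p), matching the sum-of-Bernoullis argument.
assert abs(var_binom - n * p * (1 - p)) < 1e-12

# Sample mean X_bar = S/n, so Var X_bar = Var S / n^2 = p(1-p)/n = Var X_1 / n.
var_xbar = var_binom / n**2
assert abs(var_xbar - p * (1 - p) / n) < 1e-12
```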

This note was uploaded on 12/29/2011 for the course MATH 316 taught by Professor Ansan during the Spring '10 term at SUNY Stony Brook.