elemprob-page23

11. Expectations. As in the one variable case, we have
\[
E\,g(X,Y) = \sum_x \sum_y g(x,y)\, p(x,y)
\]
in the discrete case and
\[
E\,g(X,Y) = \iint g(x,y)\, f(x,y)\, dx\, dy
\]
in the continuous case. If we set $g(x,y) = x + y$, then
\[
E(X+Y) = \iint (x+y) f(x,y)\, dx\, dy = \iint x f(x,y)\, dx\, dy + \iint y f(x,y)\, dx\, dy.
\]
If we now set $g(x,y) = x$, we see the first integral on the right is $E X$, and similarly the second is $E Y$. Therefore
\[
E(X+Y) = E X + E Y.
\]

Proposition 11.1. If $X$ and $Y$ are independent, then
\[
E[h(X)k(Y)] = E\,h(X)\, E\,k(Y).
\]
In particular, $E(XY) = (E X)(E Y)$.

Proof. By the above with $g(x,y) = h(x)k(y)$, and using independence to factor the joint density as $f(x,y) = f_X(x) f_Y(y)$,
\begin{align*}
E[h(X)k(Y)] &= \iint h(x)k(y) f(x,y)\, dx\, dy
= \iint h(x)k(y) f_X(x) f_Y(y)\, dx\, dy \\
&= \int h(x) f_X(x) \Big( \int k(y) f_Y(y)\, dy \Big)\, dx
= \int h(x) f_X(x) \big(E\,k(Y)\big)\, dx
= E\,h(X)\, E\,k(Y).
\end{align*}

The covariance of two random variables $X$ and $Y$ is defined by

\[
\mathrm{Cov}(X,Y) = E[(X - E X)(Y - E Y)].
\]
As with the variance, $\mathrm{Cov}(X,Y) = E(XY) - (E X)(E Y)$. It follows that if $X$ and $Y$ are independent, then $E(XY) = (E X)(E Y)$, and then $\mathrm{Cov}(X,Y) = 0$. Note
\begin{align*}
\mathrm{Var}(X+Y) &= E[((X+Y) - E(X+Y))^2] = E[((X - E X) + (Y - E Y))^2] \\
&= E[(X - E X)^2 + 2(X - E X)(Y - E Y) + (Y - E Y)^2] \\
&= \mathrm{Var}\,X + 2\,\mathrm{Cov}(X,Y) + \mathrm{Var}\,Y.
\end{align*}
We have the following corollary.
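Proposition 11.1 and the linearity identity $E(X+Y) = EX + EY$ can be checked numerically. Here is a minimal Python sketch using two independent fair dice as the joint distribution; this particular example is an illustration chosen here, not one from the notes.

```python
from itertools import product

# Hypothetical example: two independent fair dice, so the joint pmf is
# p(x, y) = 1/36 for every pair (x, y) in {1,...,6}^2.
outcomes = range(1, 7)
p = {(x, y): 1 / 36 for x, y in product(outcomes, outcomes)}

# Marginal expectations computed directly from the joint pmf.
EX = sum(x * prob for (x, y), prob in p.items())
EY = sum(y * prob for (x, y), prob in p.items())

# E(X+Y) and E(XY) via E g(X,Y) = sum_x sum_y g(x,y) p(x,y).
E_sum = sum((x + y) * prob for (x, y), prob in p.items())
E_prod = sum(x * y * prob for (x, y), prob in p.items())

print(E_sum, EX + EY)    # both 7.0: E(X+Y) = EX + EY
print(E_prod, EX * EY)   # both 12.25: independence gives E(XY) = (EX)(EY)
```

The second equality holds here only because the joint pmf factors; for a dependent pair, `E_prod` and `EX * EY` would generally differ.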
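The identity $\mathrm{Var}(X+Y) = \mathrm{Var}\,X + 2\,\mathrm{Cov}(X,Y) + \mathrm{Var}\,Y$ can likewise be verified on a small dependent pair. The joint pmf below is a hypothetical example (chosen here, not from the notes) that puts extra weight on the event $X = Y$, so the covariance is positive.

```python
# Hypothetical dependent pair on {0,1}^2: P(X = Y) is weighted up,
# so Cov(X, Y) > 0 and the cross term in Var(X+Y) matters.
p = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

EX = sum(x * pr for (x, y), pr in p.items())
EY = sum(y * pr for (x, y), pr in p.items())
EXY = sum(x * y * pr for (x, y), pr in p.items())

# Cov(X,Y) = E(XY) - (EX)(EY), as in the text.
cov = EXY - EX * EY

VarX = sum((x - EX) ** 2 * pr for (x, y), pr in p.items())
VarY = sum((y - EY) ** 2 * pr for (x, y), pr in p.items())
Var_sum = sum((x + y - EX - EY) ** 2 * pr for (x, y), pr in p.items())

print(cov)                             # 0.15 (up to rounding)
print(Var_sum, VarX + 2 * cov + VarY)  # both 0.8 (up to rounding)
```

Setting the four probabilities to 0.25 each would make $X$ and $Y$ independent, driving `cov` to zero and reducing the identity to $\mathrm{Var}(X+Y) = \mathrm{Var}\,X + \mathrm{Var}\,Y$.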

This note was uploaded on 12/29/2011 for the course MATH 317 taught by Professor Wen during the Spring '09 term at SUNY Stony Brook.
