Lect 19: Expectation, Covariance, Variance of Sums; Correlations
Xinghua Zheng


Outline
• Expectation of a Function of a Bivariate Random Variable
• Expectation of Sums
• Covariance of Sums
• Correlation
• The Bivariate Normal Case

EXPECTATION OF A FUNCTION OF A BIVARIATE RANDOM VARIABLE
• Suppose X, Y have joint pmf/density f(x, y), and g(x, y) is a function of two variables. Then

    E(g(X, Y)) = Σ_{x,y} g(x, y) · f(x, y)            if they're discrete,      (1)
    E(g(X, Y)) = ∫_x ∫_y g(x, y) · f(x, y) dx dy      if they're continuous.

EXAMPLE: MEAN DISTANCE
• An accident occurs at a location X that's uniformly distributed on a road of length 1.
• At the time of the accident, an ambulance is at a location Y that's also uniformly distributed on the road.
• Assume that X and Y are independent, and find the expected distance between the two locations.
• The distance is g(X, Y) = |X − Y|.
• X and Y have joint density f(x, y) = 1 for 0 ≤ x, y ≤ 1.
• Hence the expected distance is

    E(g(X, Y)) = ∫_0^1 ∫_0^1 |x − y| · 1 dx dy
               = ∫_0^1 ( ∫_0^y (y − x) dx + ∫_y^1 (x − y) dx ) dy
               = ∫_0^1 ( y²/2 + (1 − y)²/2 ) dy
               = 1/6 + 1/6 = 1/3.

• What's the distribution of |X − Y|?

EXPECTATION OF SUMS
• One important application of formula (1) is to let g(x, y) = x + y, in which case we get (in the continuous case)

    E(X + Y) = ∫_x ∫_y (x + y) f(x, y) dx dy
             = ∫_x x ( ∫_y f(x, y) dy ) dx + ∫_y y ( ∫_x f(x, y) dx ) dy
             = ∫_x x · f_X(x) dx + ∫_y y · f_Y(y) dy
             = E(X) + E(Y),

  i.e., E(X + Y) = E(X) + E(Y), just the same as in the discrete case.
• Generalization: E(Σ_{i=1}^n X_i) = Σ_{i=1}^n E(X_i).

EXAMPLES
• If X_1, ..., X_n are i.i.d. exponential random variables with parameter λ, then their sum S = Σ_{i=1}^n X_i has mean n/λ.
• The distribution of S is Gamma(n, λ).
• If Z_1, ..., Z_n are i.i.d. standard normal, then their sum S = Σ_{i=1}^n Z_i² has mean n · E(Z_1²) = n.
• The distribution of S is chi-squared with n degrees of freedom.
• What if the random variables are not independent in the above examples? The means are unchanged, since linearity of expectation does not require independence, but the distributions of the sums can change.
• If X, Y are i.i.d. positive random variables, then E(X/(X + Y)) = 1/2:
  • Let R_x = X/(X + Y) and R_y = Y/(X + Y). Any connections between R_x and R_y?
  • R_x + R_y = 1, so E(R_x) + E(R_y) = 1.
  • Do R_x and R_y have the same distribution? Yes, by symmetry, since X and Y are i.i.d.; hence they have the same expectation, and E(R_x) = 1/2.
  • Are R_x and R_y independent? No: R_y = 1 − R_x.
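Both worked examples above lend themselves to a quick Monte Carlo sanity check. The sketch below is not part of the original slides; it estimates E|X − Y| for independent Uniform(0, 1) locations and E(X/(X + Y)) for i.i.d. positive random variables. The choice of Exp(1) as the positive distribution, the sample size, and the seed are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(seed=0)
n = 1_000_000

# Mean distance: X, Y independent Uniform(0, 1); E|X - Y| should be 1/3.
x = rng.uniform(0.0, 1.0, size=n)
y = rng.uniform(0.0, 1.0, size=n)
print("E|X - Y|   ~", np.mean(np.abs(x - y)))      # ~0.3333

# E(X / (X + Y)) for i.i.d. positive X, Y; should be 1/2 by symmetry.
# Exp(1) is an arbitrary choice of positive distribution (an assumption).
u = rng.exponential(1.0, size=n)
v = rng.exponential(1.0, size=n)
print("E(X/(X+Y)) ~", np.mean(u / (u + v)))        # ~0.5
```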
COVARIANCE, & COVARIANCE OF SUMS
• The covariance between two random variables X and Y is

    Cov(X, Y) := E((X − E(X)) · (Y − E(Y))) = E(XY) − E(X)E(Y).

• Some properties of covariance:
  1. Cov(X, Y) = Cov(Y, X)
  2. Cov(X, X) = Var(X)
  3. Cov(aX + b, cY + d) = ac · Cov(X, Y)
  4. Cov(Σ_{i=1}^m X_i, Σ_{j=1}^n Y_j) = Σ_{i=1}^m Σ_{j=1}^n Cov(X_i, Y_j)
  5. Var(Σ_{i=1}^m X_i) = Σ_{i=1}^m Var(X_i) + 2 Σ_{1≤i<j≤m} Cov(X_i, X_j)

THE INDEPENDENCE CASE
• Suppose X and Y are independent, and g(x) and h(y) are two functions. Then
    E(g(X)h(Y)) = E(g(X)) · E(h(Y)).
• ⇒ if X and Y are independent, then
    Cov(X, Y) = E(XY) − E(X)E(Y) = 0,
    Var(X + Y) = Var(X) + Var(Y).
• Generalization: suppose X_1, ..., X_n are independent; then
    Var(Σ_{i=1}^n X_i) = Σ_{i=1}^n Var(X_i).

SAMPLE MEAN AND SAMPLE VARIANCE
• Suppose X_1, ..., X_n are i.i.d. with mean µ and variance σ².
• Define the sample mean X̄ and sample variance S² by

    X̄ = (1/n) Σ_{i=1}^n X_i,    S² = (1/(n − 1)) Σ_{i=1}^n (X_i − X̄)².

• What's E(X̄)? Var(X̄)? E(S²)?
  1. E(X̄) = (1/n) Σ_{i=1}^n E(X_i) = µ
  2. Var(X̄) = (1/n²) Σ_{i=1}^n Var(X_i) = σ²/n
  3. E(S²): we have the identities

       (n − 1)S² = Σ_{i=1}^n (X_i − X̄)² = Σ_{i=1}^n (X_i − µ)² − n(X̄ − µ)² = Σ_{i=1}^n X_i² − n X̄²,

     by which we get E((n − 1)S²) = nσ² − n · (σ²/n) = (n − 1)σ², i.e., E(S²) = σ².
• Implications: X̄ and S² are unbiased estimators of µ and σ², respectively; moreover, as n → ∞, Var(X̄) = σ²/n → 0, so X̄ becomes more and more concentrated around µ.

SAMPLE MEAN AND SAMPLE VARIANCE, ctd
• Let X_1, ..., X_n and X̄ be as in the previous slide.
• What's Cov(X_i − X̄, X̄) for each i?

    Cov(X_i − X̄, X̄) = Cov(X_i, X̄) − Var(X̄) = σ²/n − σ²/n = 0.

• Does this imply that X_i − X̄ and X̄ are independent? For example, when the X_i's are i.i.d. Bernoulli(p)?

CORRELATION
• Suppose X and Y are two random variables with finite variances. Then the correlation of X and Y, denoted by ρ(X, Y), is

    ρ(X, Y) = Cov(X, Y) / √(Var(X) Var(Y)).

• Correlation ρ(X, Y) is always between −1 and 1: writing σ_x, σ_y for the standard deviations of X and Y,

    0 ≤ Var(X/σ_x ± Y/σ_y) = 1 + 1 ± 2ρ(X, Y) = 2(1 ± ρ(X, Y)),

  which rearranges to −1 ≤ ρ(X, Y) ≤ 1.
• What's ρ(X, aX + b)? It equals 1 if a > 0; it is undefined if a = 0 (aX + b is then constant, with zero variance); it equals −1 if a < 0.
• Suppose Z is a random variable with standard deviation εσ_x, and is independent of X. What's ρ(X, aX + b + Z)? Since Cov(X, aX + b + Z) = a σ_x² and Var(aX + b + Z) = (a² + ε²)σ_x², we get ρ(X, aX + b + Z) = a/√(a² + ε²).
• The correlation coefficient ρ(X, Y) is a measure of the linear dependence between the two variables X and Y.
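As a sanity check on the last computation, the sketch below (not from the slides) estimates ρ(X, aX + b + Z) by simulation and compares it with the closed form a/√(a² + ε²). The parameter values a = 2, b = 1, ε = 0.5, σ_x = 3 and the seed are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(seed=1)
n = 1_000_000
a, b, eps, sigma_x = 2.0, 1.0, 0.5, 3.0    # illustrative parameters

x = rng.normal(0.0, sigma_x, size=n)        # X with sd sigma_x
z = rng.normal(0.0, eps * sigma_x, size=n)  # Z independent of X, sd eps*sigma_x
y = a * x + b + z

# Sample correlation vs. the closed form a / sqrt(a^2 + eps^2)
print("sample rho ~", np.corrcoef(x, y)[0, 1])
print("theory     =", a / np.sqrt(a**2 + eps**2))   # 2/sqrt(4.25) ~ 0.9701
```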
ILLUSTRATION OF DIFFERENT ρ's

[Figure: four scatter plots of simulated bivariate normal samples (X on the horizontal axis, Y on the vertical axis) with rho = 0.9, rho = 0.6, rho = 0, and rho = −0.3, illustrating how the strength and sign of the linear dependence change with ρ.]
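The scatter plots themselves do not survive text extraction, but figures of this kind are straightforward to regenerate. Below is a minimal plotting sketch (not from the slides, and assuming matplotlib is available). It builds each correlated pair via Y = ρX + √(1 − ρ²)Z for independent standard normals X and Z, a construction justified by Method II on the bivariate normal slide below; the sample size and seed are arbitrary.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=2)
n = 500

fig, axes = plt.subplots(2, 2, figsize=(8, 8))
for ax, rho in zip(axes.flat, [0.9, 0.6, 0.0, -0.3]):
    # If X, Z are independent N(0,1), then Y = rho*X + sqrt(1-rho^2)*Z
    # makes (X, Y) bivariate normal with correlation rho.
    x = rng.standard_normal(n)
    z = rng.standard_normal(n)
    y = rho * x + np.sqrt(1.0 - rho**2) * z
    ax.scatter(x, y, s=5)
    ax.set_title(f"rho = {rho}")
    ax.set_xlabel("X")
    ax.set_ylabel("Y")
plt.tight_layout()
plt.show()
```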
UNCORRELATEDNESS VS INDEPENDENCE
• If ρ(X, Y) = 0, then X and Y are said to be uncorrelated.
• Examples:
  1. If X and Y are independent, then Cov(X, Y) = 0, and so they're uncorrelated.
  2. Cov(X_i − X̄, X̄) = 0, and so they're uncorrelated.
• Uncorrelated but dependent? Examples:
  1. X_i − X̄ and X̄ when the X_i's are i.i.d. Bernoulli(p).
  2. Suppose X is a random variable such that E(X) = E(X³) = 0. Then Cov(X, X²) = E(X³) − E(X)E(X²) = 0, so X and X² are uncorrelated. Are they independent? (See the simulation sketch below.)
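Example 2 above is easy to see numerically. The sketch below is not from the slides; it takes X ∼ N(0, 1), which satisfies E(X) = E(X³) = 0, and shows that the sample correlation of X and X² is near zero even though the two are strongly dependent.

```python
import numpy as np

rng = np.random.default_rng(seed=3)
n = 1_000_000

# X ~ N(0,1) satisfies E(X) = E(X^3) = 0, so X and X^2 are uncorrelated...
x = rng.standard_normal(n)
print("corr(X, X^2) ~", np.corrcoef(x, x**2)[0, 1])            # ~0

# ...but they are clearly dependent: X^2 determines |X| exactly.
print("P(|X| > 2)           ~", np.mean(np.abs(x) > 2))
print("P(|X| > 2 | X^2 > 4) =", np.mean(np.abs(x[x**2 > 4]) > 2))  # exactly 1
```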
CORRELATIONS FOR BIVARIATE NORMALS
• Suppose X and Y are jointly normal with parameters µ_x = µ_y = 0, σ_x = σ_y = 1 and some −1 < ρ < 1, i.e., their joint density is given, for −∞ < x, y < ∞, by

    f_{X,Y}(x, y) = (1/(2π√(1 − ρ²))) · exp( −(x² − 2ρxy + y²) / (2(1 − ρ²)) ).

• What's Cov(X, Y)? ρ(X, Y)?
• Method I: we know that X ∼ N(0, 1) and Y ∼ N(0, 1), so
    Cov(X, Y) = E(XY) − 0 = ∫_x ∫_y xy · f_{X,Y}(x, y) dx dy
  (complicated integration, but doable).
• Method II: let (U, V) = (X, Y − ρX).
  1. By the 2D change-of-variable formula for densities, U and V have joint density

       f_{U,V}(u, v) = (1/(2π√(1 − ρ²))) · exp( −u²/2 − v²/(2(1 − ρ²)) ),   −∞ < u, v < ∞,

     hence U ∼ N(0, 1), V ∼ N(0, 1 − ρ²), and U and V are independent.

CORRELATIONS FOR BIVARIATE NORMAL, ctd
  2. Cov(X, Y) = Cov(U, V + ρU) = Cov(U, V) + ρ Var(U) = 0 + ρ · 1 = ρ.
  3. ρ(X, Y) = Cov(X, Y)/√(Var(X)Var(Y)) = ρ/√(1 · 1) = ρ.
• What if X and Y are jointly normal with general parameters µ_x, µ_y, σ_x, σ_y and ρ?
  1. Let (X_1, Y_1) = ((X − µ_x)/σ_x, (Y − µ_y)/σ_y). Compute the joint density of (X_1, Y_1) (Exercise).
  2. ...
  3. Cov(X, Y) = ρσ_xσ_y, ρ(X, Y) = ρ.
• Hence if X and Y are jointly normal, then they're uncorrelated if and only if ρ = 0. However, in Lect 17 we've seen that they're independent if and only if ρ = 0. Therefore:
• Conclusion: if X and Y are jointly normal, then they are independent if and only if they're uncorrelated.
• Caution: the condition that X and Y are jointly normal CANNOT be weakened to X and Y being marginally normal. Can you come up with a counterexample?

An Example of Marginally Normal but Not Jointly Normal
• Let X ∼ N(0, 1), B ∼ Bernoulli(1/2) independent of X, and let Y = X · (2B − 1).
• Then X and Y are both marginally normal. Why?
  • X ∼ N(0, 1) as assumed.
  • Y: for any y,
      P[Y ≤ y] = P[X · (2B − 1) ≤ y]
               = (1/2) P[X ≤ y] + (1/2) P[−X ≤ y]
               = P[X ≤ y],
    where the last step uses the symmetry of the N(0, 1) distribution; hence Y ∼ N(0, 1).
• Are X and Y uncorrelated?
• Are X and Y independent?
• Are X and Y jointly normal?
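The three closing questions can be explored by simulation. The sketch below is not from the slides (and assumes scipy is available for the normal cdf); it suggests the answers: X and Y are uncorrelated, they are not independent (|Y| = |X| always), and they are not jointly normal, since X + Y = 2XB puts probability 1/2 on the value 0 and hence cannot be normal.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=4)
n = 1_000_000

x = rng.standard_normal(n)
s = rng.choice([-1.0, 1.0], size=n)   # the sign 2B - 1, B ~ Bernoulli(1/2)
y = x * s

# Y is marginally N(0,1): compare the empirical cdf with the exact one.
print("P(Y <= 1)  ~", np.mean(y <= 1.0), " vs Phi(1) =", stats.norm.cdf(1.0))

# Uncorrelated: Cov(X, Y) = E(X^2) * E(2B - 1) = 0 by independence.
print("corr(X, Y) ~", np.corrcoef(x, y)[0, 1])

# Not independent: |Y| = |X| always.
print("P(|Y| > 2 given |X| > 2) =", np.mean(np.abs(y[np.abs(x) > 2]) > 2))

# Not jointly normal: X + Y equals 0 whenever s = -1 (probability 1/2),
# so the sum has an atom at 0 and cannot be normally distributed.
print("P(X + Y == 0) ~", np.mean(x + y == 0.0))    # ~0.5
```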