ECO2121: Methods of Economic Statistics
Additional Examples on Distributions (joint, marginal, conditional)

I type this for two reasons: (1) as an example of distributions, and (2) as an example of integration. I hope it helps, as some of you are not familiar with integration, let alone integration with multiple variables. A similar question is in Assignment 6, so you may use this example as a reference too. I illustrated the concept of these distributions with two discrete random variables in TA13 and TA13a; you might refer to them before going through the following technical details, which use integration, instead of summation, for continuous random variables.

Suppose two random variables X and Y are distributed jointly with joint pdf f_{X,Y}(x,y) = A(x + y) for 0 < x < 2, 0 < y < 1, where A is a constant.

(a) We know the "sum" of all the probabilities equals 1, but summing these probabilities for continuous random variables requires integration:

\[ 1 = \int_0^1 \int_0^2 f_{X,Y}(x,y)\,dx\,dy = \int_0^1 \int_0^2 A(x+y)\,dx\,dy. \]

Here y is treated as a "constant" in the integral over x, just like in the case of partial differentiation:

\[ \int_0^2 A(x+y)\,dx = A\left[\frac{x^2}{2} + xy\right]_0^2 = A(2 + 2y), \]

so

\[ 1 = \int_0^1 A(2 + 2y)\,dy = A\left[2y + y^2\right]_0^1 = 3A. \]

So we get A = 1/3, such that f_{X,Y}(x,y) = (x + y)/3 for 0 < x < 2, 0 < y < 1.

(b) The marginal distributions of X and Y are f_X(x) and f_Y(y):

\[ f_X(x) = \int_0^1 f_{X,Y}(x,y)\,dy = \int_0^1 \frac{x+y}{3}\,dy = \frac{1}{3}\left[xy + \frac{y^2}{2}\right]_0^1 = \frac{1+2x}{6}, \quad 0 < x < 2. \]

Note the range of values of x, i.e. 0 < x < 2, is important. Since we are looking at the "margin" of X alone, only the range of values of x is specified.

\[ f_Y(y) = \int_0^2 f_{X,Y}(x,y)\,dx = \int_0^2 \frac{x+y}{3}\,dx = \frac{1}{3}\left[\frac{x^2}{2} + xy\right]_0^2 = \frac{2(1+y)}{3}, \quad 0 < y < 1. \]

We see f_X(x) is a function of x only, and f_Y(y) is a function of y only, because we are looking at the margin. We also see that f_{X,Y}(x,y) ≠ f_X(x) × f_Y(y), so X and Y are NOT independent. Furthermore, we see that

\[ \int_0^2 f_X(x)\,dx = \int_0^2 \frac{1+2x}{6}\,dx = \frac{1}{6}\left[x + x^2\right]_0^2 = 1, \]

because f_X(x) is also a pdf, yet it just looks at things from the angle of X alone. Similarly, the integral of f_Y(y) is 1 too:

\[ \int_0^1 f_Y(y)\,dy = \int_0^1 \frac{2(1+y)}{3}\,dy = \frac{2}{3}\left[y + \frac{y^2}{2}\right]_0^1 = \frac{2}{3}\cdot\frac{3}{2} = 1. \]

(c) The conditional distributions f_{Y|X}(y|x) and f_{X|Y}(x|y):

\[ f_{Y|X}(y|x) = \frac{f_{X,Y}(x,y)}{f_X(x)} = \frac{(x+y)/3}{(1+2x)/6} = \frac{2(x+y)}{1+2x}, \quad 0 < x < 2,\ 0 < y < 1, \]

\[ f_{X|Y}(x|y) = \frac{f_{X,Y}(x,y)}{f_Y(y)} = \frac{(x+y)/3}{2(1+y)/3} = \frac{x+y}{2(1+y)}, \quad 0 < x < 2,\ 0 < y < 1. \]

Note the range of values of x and y, i.e. 0 < x < 2 and 0 < y < 1, is important too. Furthermore, we see that

\[ \int_0^1 f_{Y|X}(y|x)\,dy = \int_0^1 \frac{2(x+y)}{1+2x}\,dy = \frac{2}{1+2x}\left[xy + \frac{y^2}{2}\right]_0^1 = \frac{2x+1}{1+2x} = 1. \]

Here x is again treated as a "constant", thus it can be taken out of the integral over y. The integral is 1 because f_{Y|X}(y|x) is also a pdf, yet it just looks at things from the angle of Y conditional on X. Similarly, the integral of f_{X|Y}(x|y) is 1 too, e.g.

\[ \int_0^2 f_{X|Y}(x|y)\,dx = \int_0^2 \frac{x+y}{2(1+y)}\,dx = \frac{1}{2(1+y)}\left[\frac{x^2}{2} + xy\right]_0^2 = \frac{2+2y}{2(1+y)} = 1. \]

(d) Expectation E[X] and E[Y]:

\[ E[X] = \int_0^2 x\,f_X(x)\,dx = \int_0^2 x\cdot\frac{1+2x}{6}\,dx = \frac{1}{6}\left[\frac{x^2}{2} + \frac{2x^3}{3}\right]_0^2 = \frac{1}{6}\left(2 + \frac{16}{3}\right) = \frac{11}{9}, \]

\[ E[Y] = \int_0^1 y\,f_Y(y)\,dy = \int_0^1 y\cdot\frac{2(1+y)}{3}\,dy = \frac{2}{3}\left[\frac{y^2}{2} + \frac{y^3}{3}\right]_0^1 = \frac{2}{3}\cdot\frac{5}{6} = \frac{5}{9}. \]

(e) Conditional Expectation E[X|Y] and E[Y|X]:

\[ E[X|Y=y] = \int_0^2 x\,f_{X|Y}(x|y)\,dx = \int_0^2 x\cdot\frac{x+y}{2(1+y)}\,dx = \frac{1}{2(1+y)}\left[\frac{x^3}{3} + \frac{x^2 y}{2}\right]_0^2 = \frac{\tfrac{8}{3} + 2y}{2(1+y)} = \frac{4+3y}{3(1+y)}, \quad 0 < y < 1. \]

Note the range of values of y, i.e. 0 < y < 1, is important because E[X|Y] is a random variable of Y, i.e. different values of y would give different values of E[X|Y] as E[X|Y = y].

\[ E[Y|X=x] = \int_0^1 y\,f_{Y|X}(y|x)\,dy = \int_0^1 y\cdot\frac{2(x+y)}{1+2x}\,dy = \frac{2}{1+2x}\left[\frac{xy^2}{2} + \frac{y^3}{3}\right]_0^1 = \frac{2}{1+2x}\left(\frac{x}{2} + \frac{1}{3}\right) = \frac{2+3x}{3(1+2x)}, \quad 0 < x < 2. \]

Similarly, E[Y|X] is a random variable of X, i.e. different values of x would give different values of E[Y|X] as E[Y|X = x].
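If you have Python with the sympy library available, you can also check the algebra in parts (a) to (e) symbolically. The sketch below is only one way to do it; the variable names (fXY, fX, EX_given_Y, etc.) are just illustrative and are not part of the assignment.

```python
# Optional symbolic check of parts (a)-(e) using sympy.
import sympy as sp

x, y, A = sp.symbols('x y A', positive=True)

# (a) Normalization: the joint pdf A*(x + y) on 0 < x < 2, 0 < y < 1 must integrate to 1.
total = sp.integrate(A * (x + y), (x, 0, 2), (y, 0, 1))   # equals 3*A
A_value = sp.solve(sp.Eq(total, 1), A)[0]                  # A = 1/3
fXY = (A * (x + y)).subs(A, A_value)                       # (x + y)/3

# (b) Marginal pdfs.
fX = sp.integrate(fXY, (y, 0, 1))   # (1 + 2x)/6, for 0 < x < 2
fY = sp.integrate(fXY, (x, 0, 2))   # 2(1 + y)/3, for 0 < y < 1
print(sp.simplify(fXY - fX * fY))   # not zero, so X and Y are NOT independent

# (c) Conditional pdfs; each integrates to 1 over its own argument.
fY_given_X = sp.simplify(fXY / fX)  # 2(x + y)/(1 + 2x)
fX_given_Y = sp.simplify(fXY / fY)  # (x + y)/(2(1 + y))
print(sp.simplify(sp.integrate(fY_given_X, (y, 0, 1))))   # 1
print(sp.simplify(sp.integrate(fX_given_Y, (x, 0, 2))))   # 1

# (d) Unconditional expectations.
EX = sp.integrate(x * fX, (x, 0, 2))   # 11/9
EY = sp.integrate(y * fY, (y, 0, 1))   # 5/9

# (e) Conditional expectations, as functions of the conditioning variable.
EX_given_Y = sp.simplify(sp.integrate(x * fX_given_Y, (x, 0, 2)))   # (4 + 3y)/(3(1 + y))
EY_given_X = sp.simplify(sp.integrate(y * fY_given_X, (y, 0, 1)))   # (2 + 3x)/(3(1 + 2x))
print(EX, EY, EX_given_Y, EY_given_X)
```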
(f) Tower Expectation

We verify E{E[X|Y]} = E[X] and E{E[Y|X]} = E[Y]:

\[ E\{E[X|Y]\} = \int_0^1 \frac{4+3y}{3(1+y)}\,f_Y(y)\,dy = \int_0^1 \frac{4+3y}{3(1+y)}\cdot\frac{2(1+y)}{3}\,dy = \frac{2}{9}\int_0^1 (4+3y)\,dy = \frac{2}{9}\left[4y + \frac{3y^2}{2}\right]_0^1 = \frac{2}{9}\cdot\frac{11}{2} = \frac{11}{9} = E[X], \]

\[ E\{E[Y|X]\} = \int_0^2 \frac{2+3x}{3(1+2x)}\,f_X(x)\,dx = \int_0^2 \frac{2+3x}{3(1+2x)}\cdot\frac{1+2x}{6}\,dx = \frac{1}{18}\int_0^2 (2+3x)\,dx = \frac{1}{18}\left[2x + \frac{3x^2}{2}\right]_0^2 = \frac{10}{18} = \frac{5}{9} = E[Y]. \]

(g) Covariance Cov[X, Y] = E[XY] − E[X]E[Y]

Since we have calculated E[X] and E[Y], it suffices to calculate E[XY]:

\[ E[XY] = \int_0^1 \int_0^2 xy\,f_{X,Y}(x,y)\,dx\,dy = \frac{1}{3}\int_0^1 \int_0^2 (x^2 y + x y^2)\,dx\,dy = \frac{1}{3}\int_0^1 \left(\frac{8y}{3} + 2y^2\right)dy = \frac{1}{3}\left(\frac{4}{3} + \frac{2}{3}\right) = \frac{2}{3}. \]

Hence, Cov[X, Y] = 2/3 − (11/9)(5/9) = 54/81 − 55/81 = −1/81. As Cov[X, Y] < 0, we say X and Y are negatively correlated.

(h) Probabilities

Jointly:

\[ \Pr\{X > 1, Y > 0.5\} = \int_{0.5}^1 \int_1^2 f_{X,Y}(x,y)\,dx\,dy = \frac{1}{3}\int_{0.5}^1 \left[\frac{x^2}{2} + xy\right]_1^2 dy = \frac{1}{3}\int_{0.5}^1 \left(\frac{3}{2} + y\right)dy = \frac{1}{3}\left[\frac{3y}{2} + \frac{y^2}{2}\right]_{0.5}^1 = \frac{1}{3}\left(2 - \frac{7}{8}\right) = \frac{3}{8}. \]

X alone (marginally):

\[ \Pr\{X > 1\} = \int_1^2 f_X(x)\,dx = \int_1^2 \frac{1+2x}{6}\,dx = \frac{1}{6}\left[x + x^2\right]_1^2 = \frac{4}{6} = \frac{2}{3}. \]

Y alone (marginally):

\[ \Pr\{Y > 0.5\} = \int_{0.5}^1 f_Y(y)\,dy = \int_{0.5}^1 \frac{2(1+y)}{3}\,dy = \frac{2}{3}\left[y + \frac{y^2}{2}\right]_{0.5}^1 = \frac{2}{3}\left(\frac{3}{2} - \frac{5}{8}\right) = \frac{7}{12}. \]

We see Pr{X > 1, Y > 0.5} = 3/8 ≠ 2/3 × 7/12 = Pr{X > 1} × Pr{Y > 0.5}, because X and Y are NOT independent.

Taylor
11 April 2009
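As a postscript, parts (f) to (h) can be checked with the same sympy approach as before. This sketch is self-contained (the pdfs are redefined at the top), and again the names are only illustrative.

```python
# Optional symbolic check of parts (f)-(h) using sympy.
import sympy as sp

x, y = sp.symbols('x y', positive=True)
fXY = (x + y) / 3                    # joint pdf on 0 < x < 2, 0 < y < 1
fX = sp.integrate(fXY, (y, 0, 1))    # marginal of X: (1 + 2x)/6
fY = sp.integrate(fXY, (x, 0, 2))    # marginal of Y: 2(1 + y)/3

# (f) Tower property: averaging the conditional expectations recovers E[X] and E[Y].
EX_given_Y = sp.integrate(x * fXY / fY, (x, 0, 2))
EY_given_X = sp.integrate(y * fXY / fX, (y, 0, 1))
print(sp.simplify(sp.integrate(EX_given_Y * fY, (y, 0, 1))))   # 11/9 = E[X]
print(sp.simplify(sp.integrate(EY_given_X * fX, (x, 0, 2))))   # 5/9  = E[Y]

# (g) Covariance via Cov[X, Y] = E[XY] - E[X]E[Y].
EX = sp.integrate(x * fX, (x, 0, 2))                     # 11/9
EY = sp.integrate(y * fY, (y, 0, 1))                     # 5/9
EXY = sp.integrate(x * y * fXY, (x, 0, 2), (y, 0, 1))    # 2/3
print(EXY - EX * EY)                                     # -1/81

# (h) The joint probability differs from the product of the marginal probabilities.
p_joint = sp.integrate(fXY, (x, 1, 2), (y, sp.Rational(1, 2), 1))   # 3/8
p_X = sp.integrate(fX, (x, 1, 2))                                   # 2/3
p_Y = sp.integrate(fY, (y, sp.Rational(1, 2), 1))                   # 7/12
print(p_joint, p_X * p_Y)   # 3/8 versus 7/18
```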