prob2_3
(ii) The correlation coefficient is a number between −1 and +1; this follows again from the Cauchy–Schwarz inequality.

(iii) If Y = aX + b holds with a ≠ 0, then Var(Y) = a²·Var(X), E(XY) = a·E(X²) + b·E(X) and E(X)·E(Y) = a·(E(X))² + b·E(X), thus Cov(X, Y) = a·Var(X). It follows that ρ(X, Y) = 1 for a > 0 and ρ(X, Y) = −1 for a < 0. Linear dependence between X and Y hence implies that ρ(X, Y) attains one of the extreme values.

Theorem 4.17. Let (X, Y) be a two-dimensional random variable with Var(X) > 0 and Var(Y) > 0.

(i) If X and Y are independent, then E(X·Y) = E(X)·E(Y) and ρ(X, Y) = 0.

(ii) If ρ(X, Y) = 1 or if ρ(X, Y) = −1, then there exist real numbers a and b such that P(Y = aX + b) = 1. The coefficient a has the same sign as ρ(X, Y).

(iii) The mean squared error E((Y − aX − b)²) of a random variable Y from a linear function aX + b of the random variable X is minimal if

    a = Cov(X, Y) / Var(X)   and   b = E(Y) − a·E(X).

In this case it is given by

    E((Y − aX − b)²) = (1 − ρ(X, Y)²)·Var(Y).

Proof. The proof of (i) will only be given under the additional assumption that either both random variables are discrete or both are continuous. If X and Y are discrete random variables with joint probabilities p_{ij} = P(X = x_i, Y = y_j) and marginal probabilities p_i = P(X = x_i) and q_j = P(Y = y_j), then independence gives p_{ij} = p_i·q_j, and hence

    E(XY) = Σ_{i,j} x_i·y_j·p_{ij} = Σ_{i,j} x_i·y_j·p_i·q_j = (Σ_i x_i·p_i)·(Σ_j y_j·q_j) = E(X)·E(Y).

If X and Y are continuously distributed with densities f₁ and f₂, then f(x, y) = f₁(x)·f₂(y), (x, y) ∈ ℝ², defines a density f of (X, Y). Then:

    E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x·y·f₁(x)·f₂(y) dy dx = (∫_{−∞}^{∞} x·f₁(x) dx)·(∫_{−∞}^{∞} y·f₂(y) dy) = E(X)·E(Y).

In both cases Cov(X, Y) = E(XY) − E(X)·E(Y) = 0, and therefore ρ(X, Y) = 0.

Let us now show (iii). We find

    E((Y − aX − b)²) = Var(Y − aX − b) + (E(Y − aX − b))²
                     = Var(Y) + a²·Var(X) − 2a·Cov(X, Y) + (E(Y) − a·E(X) − b)².
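The extremal property in remark (iii) and Theorem 4.17(ii) can be checked numerically. The following sketch (not part of the notes; the sample size, seed, and coefficients a = ±3, b = 2 are arbitrary choices) draws a sample of X and verifies that Y = aX + b yields a sample correlation of +1 for a > 0 and −1 for a < 0:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)  # sample of X

# Y = aX + b is an exact linear function of X, so the (sample)
# correlation coefficient attains an extreme value, with the sign of a.
rho_pos = np.corrcoef(x, 3.0 * x + 2.0)[0, 1]   # a = 3 > 0
rho_neg = np.corrcoef(x, -3.0 * x + 2.0)[0, 1]  # a = -3 < 0

print(rho_pos, rho_neg)  # close to 1.0 and -1.0 (up to rounding)
```

The intercept b drops out entirely, as in the computation of Cov(X, Y) above: correlation is invariant under the shift by b.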
Let us define the function p by p(a, b) := E((Y − aX − b)²), a, b ∈ ℝ; that is, p is a polynomial of second order in the variables a and b. Setting the partial derivatives ∂p/∂a and ∂p/∂b equal to zero shows that p attains its minimum for

    a = Cov(X, Y) / Var(X)   and   b = E(Y) − a·E(X).

Substituting these values into the expression above yields

    E((Y − aX − b)²) = Var(Y) − Cov(X, Y)² / Var(X) = (1 − ρ(X, Y)²)·Var(Y),

which completes the proof of (iii).
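Theorem 4.17(iii) can likewise be illustrated on simulated data. In this sketch (not part of the notes; the model Y = 2X + 1 + noise, the seed, and the sample size are arbitrary assumptions), the coefficients are computed from sample moments exactly as in the theorem, and the resulting mean squared error is compared with (1 − ρ²)·Var(Y):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Simulated pair (X, Y) with a linear trend plus independent noise.
x = rng.normal(size=n)
y = 2.0 * x + 1.0 + rng.normal(size=n)

# Sample analogues of the quantities in Theorem 4.17(iii)
# (ddof=0 so that variance and covariance use the same normalization).
var_x = x.var()
cov_xy = np.cov(x, y, ddof=0)[0, 1]
rho = np.corrcoef(x, y)[0, 1]

a = cov_xy / var_x           # a = Cov(X, Y) / Var(X)
b = y.mean() - a * x.mean()  # b = E(Y) - a E(X)

mse = np.mean((y - a * x - b) ** 2)  # E((Y - aX - b)^2)
predicted = (1 - rho**2) * y.var()   # (1 - rho^2) Var(Y)

print(a, b)            # close to the true coefficients 2.0 and 1.0
print(mse, predicted)  # the two values agree
```

The agreement of `mse` and `predicted` is not a sampling accident: the algebra in the proof holds verbatim for sample moments, so the two expressions coincide up to floating-point rounding for any data set.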