PUBH 7430 Lecture 2
Julian Wolfson
Division of Biostatistics, University of Minnesota School of Public Health
September 8, 2011

Correlation and probability

Independence and lack thereof

Two events A and B are said to be independent if

  P(A occurs and B occurs) = P(A occurs) × P(B occurs)

Example: Let Y1 and Y2 be the results (0 = heads, 1 = tails) of flipping a fair coin twice. Then

  P(two heads) = P(Y1 = 0, Y2 = 0) = P(Y1 = 0) P(Y2 = 0) = 1/4

Independence and lack thereof

We commonly refer to events A and B as "correlated" if they are not independent:

  P(A occurs and B occurs) ≠ P(A occurs) × P(B occurs)

Example: Let Y1 and Y2 be the results (0 = heads, 1 = tails) of the following coin-flipping experiment:

• Flip the coin once, and record Y1 as the first flip.
• If the first flip is tails, flip the coin again and record Y2 as the second flip.
• If the first flip is heads, flip the coin two more times. Record Y2 as heads if one or two heads come up, and as tails if no heads come up.

Then

  P(two heads) = P(Y1 = 0, Y2 = 0) = 1/2 × 3/4 = 3/8 ≠ 1/4

Consequences of correlation

• Many statistics (e.g., the mean) are based on sums.
• An important task is to estimate the variance of a given statistic.
• How does correlation impact variance estimates?

Let Y1 and Y2 be two random variables (possibly dependent). What is the variance of Y1 + Y2?
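The two coin-flipping designs above can be checked with a short Monte Carlo simulation. This sketch is not part of the original handout, and the function names are illustrative; it estimates P(two heads) under each design:

```python
import random

def independent_flips():
    # Two independent fair-coin flips: 0 = heads, 1 = tails.
    return random.randint(0, 1), random.randint(0, 1)

def correlated_flips():
    # The dependent design from the handout: Y2's distribution depends on Y1.
    y1 = random.randint(0, 1)
    if y1 == 1:
        # First flip was tails: flip once more and record it as Y2.
        y2 = random.randint(0, 1)
    else:
        # First flip was heads: flip twice more; Y2 is heads (0)
        # if at least one of the two extra flips is heads.
        flips = [random.randint(0, 1) for _ in range(2)]
        y2 = 0 if min(flips) == 0 else 1
    return y1, y2

def prob_two_heads(experiment, n=200_000):
    # Estimate P(Y1 = 0, Y2 = 0) by simulation.
    hits = sum(1 for _ in range(n) if experiment() == (0, 0))
    return hits / n

random.seed(0)
print(prob_two_heads(independent_flips))  # close to 1/4
print(prob_two_heads(correlated_flips))   # close to 3/8
```

With 200,000 replicates the Monte Carlo standard error is about 0.001, so the two estimates land clearly on 1/4 and 3/8 respectively.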
  Var(Y1 + Y2) = Var(Y1) + Var(Y2) + 2[E(Y1 Y2) − E(Y1) E(Y2)]

Covariance

We give the bracketed term a special name, the covariance:

  Cov(X, Y) = E(XY) − E(X) E(Y)

Covariance – properties

  Cov(X, Y) = E(XY) − E(X) E(Y) ≡ E[(X − E(X))(Y − E(Y))]

• Cov(X, X) = Var(X)
• Covariance is symmetric: Cov(X, Y) = Cov(Y, X)
• Covariance is not scale-invariant: Cov(2X, 3Y) = 6 Cov(X, Y), which in general differs from Cov(X, Y)

Covariance – properties

Recall: if Y1 and Y2 are independent, then

  P(Y1 = y1, Y2 = y2) = P(Y1 = y1) × P(Y2 = y2)

One can show that independence also implies E(Y1 Y2) = E(Y1) × E(Y2), and hence

  Cov(Y1, Y2) = E(Y1 Y2) − E(Y1) E(Y2) = E(Y1) E(Y2) − E(Y1) E(Y2) = 0

(Note: Cov(Y1, Y2) = 0 does not necessarily imply that Y1 and Y2 are independent.)

Consequences of correlation

More generally,

  Var(Y1 + ··· + Yn) = Σ_{i=1..n} Var(Yi) + 2 Σ_{i<j} Cov(Yi, Yj)
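The variance identity can be verified exactly on the dependent coin experiment, whose joint pmf follows from the description above: P(0,0) = 3/8, P(0,1) = 1/8, P(1,0) = 1/4, P(1,1) = 1/4. This numeric check is a sketch added for illustration, not part of the original handout; it also exhibits a pair with zero covariance that is nevertheless dependent:

```python
# Joint pmf of (Y1, Y2) for the dependent coin experiment (0 = heads, 1 = tails).
pmf = {(0, 0): 3/8, (0, 1): 1/8, (1, 0): 1/4, (1, 1): 1/4}

def E(f):
    # Expectation of f(Y1, Y2) under the joint pmf.
    return sum(f(y1, y2) * p for (y1, y2), p in pmf.items())

EY1, EY2 = E(lambda a, b: a), E(lambda a, b: b)
cov = E(lambda a, b: a * b) - EY1 * EY2          # Cov(Y1, Y2) = 1/16 here
var1 = E(lambda a, b: a * a) - EY1 ** 2
var2 = E(lambda a, b: b * b) - EY2 ** 2
var_sum = E(lambda a, b: (a + b) ** 2) - E(lambda a, b: a + b) ** 2

# Identity: Var(Y1 + Y2) = Var(Y1) + Var(Y2) + 2 Cov(Y1, Y2)
print(var_sum, var1 + var2 + 2 * cov)  # both equal 39/64

# Zero covariance without independence: Y uniform on {-1, 0, 1}, X = Y^2.
ys = [-1, 0, 1]
EY = sum(ys) / 3
EX = sum(y * y for y in ys) / 3
EXY = sum(y ** 3 for y in ys) / 3
cov_xy = EXY - EX * EY  # 0, even though X is a deterministic function of Y
print(cov_xy)
```

Because the covariance here is positive (1/16), Var(Y1 + Y2) exceeds the sum of the individual variances; ignoring the dependence would understate the variance of the sum.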
This note was uploaded on 11/21/2011 for the course PUBH 7430, taught by Prof. Eberly during the Fall '04 term at Minnesota.
