Stat 302, Introduction to Probability
Jiahua Chen
January-April 2011

Lecture 19

What LLNs do not answer

Let $X_1, X_2, \ldots$ be a sequence of iid random variables with mean $\mu$ and variance $\sigma^2$. Denote
$$\bar{X}_n = n^{-1}(X_1 + X_2 + \cdots + X_n).$$
According to the WLLN, $\lim_{n \to \infty} P(|\bar{X}_n - \mu| \geq \epsilon) = 0$ for any $\epsilon > 0$. According to the SLLN, $P(\lim_{n \to \infty} \bar{X}_n = \mu) = 1$.

Both laws tell us that $\bar{X}_n \to \mu$. Yet how close is $\bar{X}_n$ to $\mu$?

More detailed error bound

By Chebyshev's inequality, we have
$$P(|\bar{X}_n - \mu| \geq \epsilon) \leq \frac{\sigma^2}{n \epsilon^2}.$$
Letting $\epsilon = n^{-1/3}$, it becomes
$$P(|\bar{X}_n - \mu| \geq n^{-1/3}) \leq \frac{\sigma^2}{n^{1/3}}.$$
Note that this upper bound still tends to 0 as $n \to \infty$. That is, as an estimator of $\mu$, $\bar{X}_n$ is precise to within a range of $n^{-1/3}$.

Taking the same inequality again and letting $\epsilon = n^{-2/3}$, it becomes
$$P(|\bar{X}_n - \mu| \geq n^{-2/3}) \leq n^{1/3} \sigma^2.$$
This time the upper bound tends to $\infty$ as $n \to \infty$. The largest possible probability is 1, so an upper bound exceeding 1 is not useful. Yet it hints that the difference $|\bar{X}_n - \mu|$ will typically exceed $n^{-2/3}$.

Tossing a die 6000 times

These analyses suggest that the error is typically in the range $n^{-2/3} < |\bar{X}_n - \mu| < n^{-1/3}$. When $n = 6000$, this becomes $0.003 < |\bar{X}_n - \mu| < 0.055$.

In fact, the rate $n^{-1/2}$ gives tighter bounds. Applying Chebyshev's inequality with $\epsilon = 3\sigma/\sqrt{n}$,
$$P(\sqrt{n} \, |\bar{X}_n - \mu| \geq 3\sigma) \leq \frac{1}{9}.$$
That is, with probability at least $8/9 \approx 89\%$, in this die-tossing example, $|\bar{X}_n - \mu| < 0.01443376$. (The quoted numbers correspond to $\bar{X}_n$ being the proportion of a fixed face among the 6000 tosses, so $\mu = 1/6$, $\sigma^2 = (1/6)(5/6) = 5/36$, and $3\sigma/\sqrt{6000} \approx 0.01443376$.)

Tossing a die 6000 times: exact value

We may directly compute
$$P(|\bar{X}_n - \mu| > 0.01443376) = 0.0030543 \approx 0.3\%,$$
which is much lower than the Chebyshev upper bound of $1/9 \approx 11\%$.
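The bounds above are easy to check numerically. Here is a minimal sketch (not part of the original slides) that evaluates the three error ranges for the die example, assuming, as the quoted numbers suggest, that $\bar{X}_n$ is the proportion of a fixed face, so $\sigma^2 = 5/36$:

import math

n = 6000
sigma = math.sqrt(5 / 36)            # sd of the indicator of a fixed face

# Rough range from the n^(-1/3) and n^(-2/3) analyses:
print(n ** (-2 / 3))                 # ~0.003, lower end of the typical error
print(n ** (-1 / 3))                 # ~0.055, upper end of the typical error

# Chebyshev at the n^(-1/2) rate, with eps = 3*sigma/sqrt(n):
eps = 3 * sigma / math.sqrt(n)
print(eps)                           # ~0.01443376
print(sigma ** 2 / (n * eps ** 2))   # Chebyshev bound = 1/9 ~ 0.111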
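The exact value on the last slide can be reproduced from the binomial distribution of the face count. A hedged sketch, assuming $S = n\bar{X}_n \sim \mathrm{Binomial}(6000, 1/6)$ and using scipy:

import math
from scipy.stats import binom

n, p = 6000, 1 / 6
eps = 0.01443376

# |Xbar_n - p| > eps  <=>  S <= floor(n*(p - eps)) or S >= ceil(n*(p + eps))
lo = math.floor(n * (p - eps))       # 913
hi = math.ceil(n * (p + eps))        # 1087

prob = binom.cdf(lo, n, p) + binom.sf(hi - 1, n, p)   # sf(k) = P(S > k)
print(prob)                          # ~0.003; the slide quotes 0.0030543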