Stat 302, Introduction to Probability
Jiahua Chen
January–April 2011, Lecture 19

What LLNs do not answer

Let $X_1, X_2, \ldots$ be a sequence of iid random variables with mean $\mu$ and variance $\sigma^2$. Denote
$$\bar{X}_n = n^{-1}(X_1 + X_2 + \cdots + X_n).$$
According to the WLLN, $\lim_{n \to \infty} P(|\bar{X}_n - \mu| \geq \epsilon) = 0$ for any $\epsilon > 0$. According to the SLLN, $P(\lim_{n \to \infty} \bar{X}_n = \mu) = 1$.

Both laws tell us that $\bar{X}_n \approx \mu$. Yet how close are they?

More detailed error bound

By Chebyshev's inequality, we have
$$P(|\bar{X}_n - \mu| \geq \epsilon) \leq \frac{\sigma^2}{n \epsilon^2}.$$
Letting $\epsilon = n^{-1/3}$, this becomes
$$P(|\bar{X}_n - \mu| \geq n^{-1/3}) \leq \sigma^2 n^{-1/3}.$$
Note that the upper bound still tends to 0 as $n \to \infty$. That is, $\bar{X}_n$ estimates $\mu$ to within a range of about $n^{-1/3}$.

Letting instead $\epsilon = n^{-2/3}$, the same inequality gives
$$P(|\bar{X}_n - \mu| \geq n^{-2/3}) \leq n^{1/3} \sigma^2,$$
and this upper bound tends to $\infty$ as $n \to \infty$. Since no probability can exceed 1, an infinite upper bound carries no information; it does, however, suggest (without proving) that the difference $|\bar{X}_n - \mu|$ may well exceed $n^{-2/3}$.

Tossing a die 6000 times

These analyses suggest that, typically, $n^{-2/3} < |\bar{X}_n - \mu| < n^{-1/3}$. When $n = 6000$, this becomes $0.003 < |\bar{X}_n - \mu| < 0.055$.
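The two rate claims above can be checked by simulation. The following is a Monte Carlo sketch, not from the slides; it assumes $\bar{X}_n$ is the observed proportion of one fixed face in $n$ fair-die tosses (so $\mu = 1/6$ and $\sigma^2 = (1/6)(5/6)$, an interpretation consistent with the numbers on the next slide). It estimates how often $|\bar{X}_n - \mu|$ exceeds $n^{-1/3}$ (rarely, within the Chebyshev bound) versus $n^{-2/3}$ (frequently, as the vacuous bound hints).

```python
import random

def exceed_freq(n, eps, trials, seed):
    """Monte Carlo estimate of P(|Xbar_n - mu| >= eps), where Xbar_n is the
    observed proportion of one fixed face in n fair-die tosses (mu = 1/6)."""
    rng = random.Random(seed)
    mu = 1 / 6
    hits = 0
    for _ in range(trials):
        count = sum(1 for _ in range(n) if rng.randrange(6) == 0)
        if abs(count / n - mu) >= eps:
            hits += 1
    return hits / trials

n = 6000
sigma2 = (1 / 6) * (5 / 6)   # variance of a single Bernoulli(1/6) indicator

# eps = n^(-1/3): Chebyshev bound is sigma^2 * n^(-1/3) ~ 0.0076,
# and exceedances are in fact very rare.
freq_loose = exceed_freq(n, n ** (-1 / 3), trials=200, seed=0)

# eps = n^(-2/3): the Chebyshev bound is vacuous, and the error
# really does exceed eps in a large fraction of trials.
freq_tight = exceed_freq(n, n ** (-2 / 3), trials=200, seed=1)
```

The contrast between the two frequencies is the point of the slides: $n^{-1/3}$ is a safe (loose) accuracy claim, while $n^{-2/3}$ is too optimistic.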
In general, it is the rate $n^{-1/2}$ that gives tight bounds. By Chebyshev's inequality,
$$P\left( \frac{\sqrt{n}\,|\bar{X}_n - \mu|}{\sigma} \geq 3 \right) \leq \frac{1}{9}.$$
That is, with probability at least $8/9 \approx 89\%$, in this die-tossing example $|\bar{X}_n - \mu| < 0.01443376$. (The threshold $0.01443376$ equals $3\sigma/\sqrt{n}$ with $n = 6000$ and $\sigma^2 = (1/6)(5/6)$, which suggests that $\bar{X}_n$ here is the observed proportion of a fixed face, with $\mu = 1/6$.)

Tossing a die 6000 times: exact value

We may directly compute
$$P(|\bar{X}_n - \mu| > 0.01443376) = 0.0030543 \approx 0.3\%,$$
which is much lower than the Chebyshev upper bound of $1/9 \approx 11\%$.
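The exact value above can be reproduced with a short computation. This is a sketch under the same assumption as before: the count of the fixed face, $n\bar{X}_n$, is taken to be Binomial$(6000, 1/6)$, which matches the threshold $0.01443376 = 3\sqrt{p(1-p)/n}$. The log-gamma trick keeps the huge binomial coefficients from overflowing.

```python
import math

def binom_log_pmf(n, p, k):
    """log P(S = k) for S ~ Binomial(n, p), via log-gamma to avoid overflow."""
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p) + (n - k) * math.log(1 - p))

def two_sided_tail(n, p, eps):
    """P(|S/n - p| > eps) for S ~ Binomial(n, p).

    Assumes n*(p - eps) and n*(p + eps) are not integers, so the
    strict and non-strict tails coincide (true in this example)."""
    lo = math.floor(n * (p - eps))   # largest k with k/n - p < -eps
    hi = math.ceil(n * (p + eps))    # smallest k with k/n - p > eps
    left = sum(math.exp(binom_log_pmf(n, p, k)) for k in range(0, lo + 1))
    right = sum(math.exp(binom_log_pmf(n, p, k)) for k in range(hi, n + 1))
    return left + right

n, p = 6000, 1 / 6
eps = 3 * math.sqrt(p * (1 - p) / n)   # = 0.01443376..., the 3-sigma radius
p_exact = two_sided_tail(n, p, eps)    # close to the slide's 0.0030543
```

The computed tail probability comes out near the slide's 0.3% (any small gap is likely a boundary convention in the original computation), far below the Chebyshev bound of $1/9$, illustrating how conservative Chebyshev's inequality is compared with the exact distribution.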