Law of Large Numbers
10/27/2005

• An intuitive way to view the probability of a certain outcome is as the frequency with which that outcome occurs in the long run.
• We defined probability mathematically as a value of a distribution function for the random variable representing the experiment.
• The Law of Large Numbers shows that this model is consistent with the frequency interpretation of probability.

Chebyshev Inequality

Theorem. Let X be a discrete random variable with expected value μ = E(X), and let ε > 0 be any positive real number. Then

    P(|X − μ| ≥ ε) ≤ V(X) / ε².

Proof. Let m(x) denote the distribution function of X. Then

    P(|X − μ| ≥ ε) = Σ_{|x − μ| ≥ ε} m(x).

On the other hand,

    V(X) = Σ_x (x − μ)² m(x) ≥ Σ_{|x − μ| ≥ ε} (x − μ)² m(x) ≥ ε² Σ_{|x − μ| ≥ ε} m(x) = ε² P(|X − μ| ≥ ε),

and dividing both sides by ε² gives the inequality. □

Example

• Let X be any random variable with E(X) = μ and V(X) = σ².
• Then, if ε = kσ, Chebyshev's Inequality states that

    P(|X − μ| ≥ kσ) ≤ σ² / (k²σ²) = 1/k².
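The 1/k² form of the bound is easy to check empirically. The following is a minimal Monte Carlo sketch (not from the notes): the choice of a fair six-sided die as X, the sample size, and the k values are all illustrative assumptions, and the empirical frequencies should fall at or below Chebyshev's bound for each k.

```python
# Monte Carlo check of Chebyshev's inequality for a fair die.
# Illustrative sketch: the distribution, sample size, and k values
# are arbitrary choices, not taken from the lecture notes.
import random

random.seed(0)

n = 100_000
# X uniform on {1, ..., 6}: mu = 3.5, V(X) = 35/12.
mu = 3.5
var = 35 / 12
sigma = var ** 0.5

samples = [random.randint(1, 6) for _ in range(n)]

results = {}
for k in (1, 1.5, 2):
    eps = k * sigma
    # Empirical frequency of the event |X - mu| >= k*sigma.
    freq = sum(1 for x in samples if abs(x - mu) >= eps) / n
    bound = 1 / k**2
    results[k] = (freq, bound)
    print(f"k={k}: P(|X - mu| >= k*sigma) ~ {freq:.4f} <= {bound:.4f}")
```

Note that for k = 1 the true probability is P(X ∈ {1, 6}) = 1/3, well under the (trivial) bound of 1; Chebyshev's inequality is loose in general but requires nothing beyond a finite variance.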
This note was uploaded on 07/16/2010 for the course MATH 20 taught by Professor Ionescu during the Fall '05 term at Dartmouth.