IE 300/GE 331 Lecture 19, Negar Kiyavash, UIUC

Probability Inequalities:
• Markov inequality: If X is a nonnegative random variable, then for any a > 0:
    P(X ≥ a) ≤ E[X]/a.
• Chebyshev inequality: If X is a random variable with mean μ and variance σ², then for any c > 0:
    P(|X − μ| ≥ c) ≤ σ²/c².
• Important trick: If we cannot calculate the variance of a random variable with finite range (it takes values in [a, b]), we can use the bound V[X] ≤ (b − a)²/4.

Limit Theorems
• Link statistics and probability theory.
• Explain why descriptive statistics such as the sample mean are meaningful to look at.
• Explain why relative frequencies are good estimates of true probabilities.

Law of Large Numbers (LLN):
• Let X1, …, Xn be independent, identically distributed random variables with mean μ, and let Mn = (X1 + … + Xn)/n denote the sample mean.
• Weak LLN (WLLN): For every ε > 0, we have:
    P(|Mn − μ| ≥ ε) = P(|(X1 + … + Xn)/n − μ| ≥ ε) → 0 as n → ∞.
• Strong LLN (SLLN): The sample mean converges to the true mean μ with probability 1 (almost surely):
    P(lim n→∞ (X1 + … + Xn)/n = μ) = 1.

What does the LLN imply?
• WLLN: The sample mean is a good estimator of the true mean, in the sense that with enough samples it falls, with high probability, into an ε-neighborhood of the true mean.
• SLLN: Says everything that the WLLN says (the WLLN follows if the SLLN holds), and more.
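The bounds and the LLN above can be checked empirically with a short simulation. The sketch below is a hypothetical example not from the lecture: it draws X ~ Uniform(0, 1), so E[X] = 1/2 and V[X] = 1/12 (and the finite-range trick gives the looser bound V[X] ≤ (1 − 0)²/4 = 1/4), then compares empirical tail probabilities against the Markov and Chebyshev bounds and watches the sample mean approach μ.

```python
import random

random.seed(0)

# Hypothetical example: X ~ Uniform(0, 1), so mu = 0.5, sigma^2 = 1/12.
N = 100_000
samples = [random.random() for _ in range(N)]

# Markov inequality: P(X >= a) <= E[X]/a for nonnegative X; here a = 0.8.
a = 0.8
p_markov = sum(x >= a for x in samples) / N   # empirically close to 0.2
markov_bound = 0.5 / a                        # E[X]/a = 0.625
assert p_markov <= markov_bound

# Chebyshev inequality: P(|X - mu| >= c) <= sigma^2/c^2; here c = 0.4.
c = 0.4
p_cheb = sum(abs(x - 0.5) >= c for x in samples) / N  # empirically close to 0.2
cheb_bound = (1 / 12) / c**2                          # about 0.52
assert p_cheb <= cheb_bound

# LLN: the sample mean (X1 + ... + Xn)/n settles near mu = 0.5 as n grows.
for n in (10, 1_000, 100_000):
    m_n = sum(samples[:n]) / n
    print(n, m_n)
```

Note that both bounds are loose here (tail probability ≈ 0.2 versus bounds of 0.625 and 0.52); Markov and Chebyshev trade tightness for generality, holding for any distribution with the stated moments.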
This note was uploaded on 09/08/2009 for the course GE 331 taught by Professor Negar Kiyavash during the Spring '09 term at University of Illinois at Urbana–Champaign.