IE 300/GE 331 Lecture 19, Negar Kiyavash, UIUC

Probability Inequalities:

Markov inequality: if X is a nonnegative random variable, then for any a > 0:

  P(X \ge a) \le \frac{E[X]}{a}

Chebyshev inequality: if X is a random variable with mean \mu and variance \sigma^2, then for any c > 0:

  P(|X - \mu| \ge c) \le \frac{\sigma^2}{c^2}

Important trick: if we cannot calculate the variance of a random variable with finite range (it takes values in [a, b]), we can use the bound

  V[X] \le \frac{(b - a)^2}{4}.

Limit Theorems:

Limit theorems link statistics and probability theory. They explain why descriptive statistics such as the sample mean are meaningful to look at, and why relative frequencies are good estimates of true probabilities.

Law of Large Numbers (LLN):

Let X_1, \dots, X_n be independent, identically distributed random variables with mean \mu, and let M_n = (X_1 + \dots + X_n)/n be the sample mean.

WLLN: for every \epsilon > 0,

  P(|M_n - \mu| \ge \epsilon) = P\left(\left|\frac{X_1 + \dots + X_n}{n} - \mu\right| \ge \epsilon\right) \to 0 \quad \text{as } n \to \infty.

SLLN: the sample mean converges to the true mean \mu with probability 1 (almost surely):

  P\left(\lim_{n \to \infty} \frac{X_1 + \dots + X_n}{n} = \mu\right) = 1.

What does the LLN imply?

WLLN: the sample mean is a good estimator of the true mean, in the sense that with enough samples it falls, with high probability, into an \epsilon-neighborhood of the true mean.

SLLN: says everything that the WLLN says, and more, since the WLLN follows whenever the SLLN holds.
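The bounds and the LLN above can be illustrated numerically. The following is a minimal sketch (not from the lecture): the helper names `chebyshev_bound`, `range_variance_bound`, and `sample_mean` are made up for illustration, and Uniform(0, 1) is an assumed example distribution with known mean 1/2 and variance 1/12.

```python
import random

def chebyshev_bound(variance, c):
    """Chebyshev upper bound on P(|X - mu| >= c)."""
    return variance / c**2

def range_variance_bound(a, b):
    """Bound V[X] <= (b - a)^2 / 4 for an r.v. supported on [a, b]."""
    return (b - a) ** 2 / 4

def sample_mean(n, draw):
    """Average of n i.i.d. draws; by the LLN this approaches the true mean."""
    return sum(draw() for _ in range(n)) / n

random.seed(0)  # fixed seed so the illustration is reproducible

# Example: X ~ Uniform(0, 1), true mean 1/2, true variance 1/12.
draw = random.random

# The range bound gives V[X] <= (1 - 0)^2 / 4 = 0.25, which is indeed
# looser than the exact variance 1/12, as a generic bound should be.
loose_var = range_variance_bound(0, 1)

# Chebyshev with c = 0.5: P(|X - 1/2| >= 0.5) <= (1/12) / 0.25 = 1/3.
cheb = chebyshev_bound(1/12, 0.5)

# LLN: the sample mean with many draws sits close to the true mean 1/2.
m_100k = sample_mean(100_000, draw)
```

For Uniform(0, 1) the Chebyshev bound 1/3 is very conservative (the exact probability of landing at the endpoints is 0), which is typical: these inequalities trade tightness for complete generality.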
Spring '09, Negar Kiyavash