# ECE313.Lecture41 - ECE 313 Probability with Engineering Applications


Limit Theorems
Professor Dilip V. Sarwate
Department of Electrical and Computer Engineering
ECE 313 Probability with Engineering Applications
© 2000 Dilip V. Sarwate, University of Illinois at Urbana-Champaign. All Rights Reserved.

## Slide 2 of 39: What are limit theorems?

Limit theorems specify the probabilistic behavior of n random variables as n → ∞.

Possible restrictions on the RVs:
- Independent random variables
- Uncorrelated random variables
- Identical marginal CDFs/pdfs/pmfs
- Identical means and/or variances
## Slide 3 of 39: The average of n RVs

n random variables X_1, X_2, …, X_n have finite expectations μ_1, μ_2, …, μ_n.

Let Z = (X_1 + X_2 + … + X_n)/n. What is E[Z]?

Expectation is a linear operator, so

E[Z] = (E[X_1] + E[X_2] + … + E[X_n])/n

That is, the expected value of the average of n RVs equals the numerical average of their expectations. If E[X_i] = μ for all i, then E[Z] = μ also.
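The linearity property above is easy to check numerically. The following sketch is illustrative only (not from the lecture); it draws each X_i uniform on [0, 2μ_i] so that E[X_i] = μ_i, and compares the simulated E[Z] against the average of the μ_i.

```python
import random

# Illustrative check (assumed setup, not from the lecture):
# E[Z] for Z = (X1 + ... + Xn)/n should equal the average of the means.
# Each Xi ~ Uniform(0, 2*mu_i), so E[Xi] = mu_i.
random.seed(0)
mus = [1.0, 2.0, 3.0]          # mu_1, mu_2, mu_3
trials = 200_000

total = 0.0
for _ in range(trials):
    xs = [random.uniform(0, 2 * mu) for mu in mus]
    total += sum(xs) / len(xs)  # one sample of Z

est_EZ = total / trials
avg_of_means = sum(mus) / len(mus)
print(est_EZ, avg_of_means)     # both close to 2.0
```

With 200,000 trials the simulated E[Z] agrees with (μ_1 + μ_2 + μ_3)/3 = 2 to within a few thousandths, independence not being needed for this expectation result.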

## Slide 4 of 39: The sample mean

Model: an experiment is repeated n times. X_1, X_2, …, X_n are the n observed values of a random variable X on the n independent trials of the experiment. Random variable X has finite mean μ. The X_i's are said to be independent, identically distributed (i.i.d. or iid) random variables.

Z = (X_1 + X_2 + … + X_n)/n is called the sample mean.
## Slide 5 of 39: Variance of the sample mean

For i.i.d. RVs X_i with finite mean and variance, and sample mean Z = (X_1 + X_2 + … + X_n)/n:

E[Z] = E[X] = μ

var(Z) = n⁻² · var(X_1 + X_2 + … + X_n)
       = n⁻² · [var(X_1) + var(X_2) + … + var(X_n)]
       = n⁻¹ · var(X)

This is because the RVs are independent; hence cov(X_i, X_j) = 0 if i ≠ j.
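The var(Z) = var(X)/n relation can also be verified by simulation. This is an illustrative sketch (assumed setup, not from the lecture) using X uniform on [0, 1], for which var(X) = 1/12:

```python
import random
import statistics

# Illustrative check: var(Z) = var(X)/n for the sample mean of n i.i.d.
# draws. X ~ Uniform(0, 1), so var(X) = 1/12 and var(Z) should be 1/(12n).
random.seed(1)
n = 10
trials = 100_000

z_samples = [statistics.fmean(random.random() for _ in range(n))
             for _ in range(trials)]
var_Z = statistics.pvariance(z_samples)
print(var_Z, (1 / 12) / n)      # both close to 1/120
```

The simulated variance of Z comes out close to 1/120, a factor of n = 10 smaller than var(X) itself, matching the derivation on the slide.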

## Slide 6 of 39: Variance decreases as n increases

Z is the average of the n observed values of a random variable X with mean μ and variance var(X):

E[Z] = μ and var(Z) = n⁻¹ · var(X)

If we wish to estimate the value of μ, then the value of the sample mean Z is a much better estimator than the value of any individual observation X_i. This has direct application to experimental results.
## Slide 7 of 39: Confidence interval for the mean

E[Z] = μ and var(Z) = n⁻¹ · var(X) = n⁻¹σ²

Assume var(X) = σ² is known. Applying the Chebyshev inequality to Z gives

P{|Z − μ| ≥ a} ≤ var(Z)/a² = σ²/(na²)
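One practical use of this bound is choosing a sample size: since P{|Z − μ| ≥ a} ≤ σ²/(na²), taking n ≥ σ²/(δa²) guarantees the sample mean is within a of μ with probability at least 1 − δ. The helper below is a hypothetical illustration (its name and interface are not from the lecture):

```python
import math

def chebyshev_sample_size(sigma2: float, a: float, delta: float) -> int:
    """Smallest n such that sigma2 / (n * a**2) <= delta, i.e. the
    Chebyshev bound guarantees P{|Z - mu| >= a} <= delta."""
    return math.ceil(sigma2 / (delta * a * a))

# Example: sigma^2 = 1, want |Z - mu| < 0.1 with probability >= 0.95.
n_needed = chebyshev_sample_size(1.0, 0.1, 0.05)
print(n_needed)   # 2000
```

Because Chebyshev holds for any distribution with finite variance, this n is conservative; distribution-specific bounds (or the central limit theorem) typically justify far smaller samples.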
