Lecture Notes 7: Convergence and Limit Theorems

Outline
• Motivation
• Convergence with Probability 1
• Convergence in Mean Square
• Convergence in Probability, WLLN
• Convergence in Distribution, CLT

Motivation

• One of the key questions in statistical signal processing is how to estimate the statistics of a r.v., e.g., its mean, variance, or distribution.
• To estimate such a statistic, we collect samples and use an estimator in the form of a sample average.
• How good is the estimator? Does it converge to the true statistic? How many samples do we need to ensure, with some confidence, that we are within a certain range of the true value of the statistic?
• Another key question in statistical signal processing is how to estimate a signal from noisy observations, e.g., using MSE or linear MSE estimation. Does the estimator converge to the true signal?
• The subject of convergence and limit theorems for r.v.s addresses such questions.

Example: Estimating the Mean of a R.V.

• Let X be a r.v. with finite but unknown mean E(X).
• To estimate the mean, we generate i.i.d. samples X_1, X_2, ..., X_n drawn according to the same distribution as X and compute the sample mean
  $S_n = \frac{1}{n}\sum_{i=1}^{n} X_i$
• Does S_n converge to E(X) as we increase n? If so, how fast?
• But what does it mean to say that a sequence of r.v.s S_n converges to E(X)?
• First we give an example. Let X_1, X_2, ..., X_n be i.i.d. N(0, 1). We use Matlab to generate 6 sets of outcomes of X_1, ..., X_n and plot s_n for each of the 6 sets of outcomes as a function of n. Note that each s_n sequence appears to be converging to 0, the mean of the r.v., as n increases. (A code sketch of this experiment is given at the end of these notes.)

Plots of Sample Sequences of S_n

[Figure: sample sequences of s_n plotted against n in four panels, for n up to 10, 100, 1000, and 10000, with vertical-axis ranges of roughly ±2, ±0.5, ±0.2, and ±0.05 respectively.]

Convergence With Probability 1

• Recall that a sequence of numbers x_1, x_2, ..., x_n, ... converges to x if for every $\epsilon > 0$ there exists an $n(\epsilon)$ such that $|x_n - x| < \epsilon$ for every $n \geq n(\epsilon)$.
• Now consider a sequence of r.v.s X_1, X_2, ..., X_n, ..., all defined on the same probability space. For every outcome $\omega$ we obtain a sample sequence (a sequence of numbers) $X_1(\omega), X_2(\omega), \ldots, X_n(\omega), \ldots$
• A sequence X_1, X_2, X_3, ... of r.v.s is said to converge to a random variable X with probability 1 (w.p.1) if
  $\mathrm{P}\{\omega : \lim_{n\to\infty} X_n(\omega) = X(\omega)\} = 1$
• This means that the set of sample paths that converge to $X(\omega)$, in the sense of a sequence of numbers converging to a limit, has probability 1.
• Equivalently, X_1, X_2, ..., X_n, ... converges to X w.p.1 if for every $\epsilon > 0$,
  $\lim_{m\to\infty} \mathrm{P}\{\,|X_n - X| < \epsilon \text{ for every } n \geq m\,\} = 1$
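Code sketch of the sample-mean experiment

The notes describe a Matlab experiment for the plots of S_n, but the code itself is not included here. Below is a minimal sketch in Python with NumPy and Matplotlib that reproduces the same setup as described above (six i.i.d. N(0, 1) sample paths and their running sample means S_n). The library choice, the random seed, and the single log-scaled panel are assumptions made for illustration, not the course's original code.

```python
# Minimal sketch (assumed Python/NumPy/Matplotlib, not the course's Matlab code):
# draw six i.i.d. N(0, 1) sample paths, compute the running sample mean
# S_n = (1/n) * sum_{i=1}^n X_i, and plot each path against n.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)   # seed chosen arbitrarily for reproducibility

num_paths = 6                    # six sets of outcomes, as in the notes
n_max = 10_000                   # largest n shown in the notes' figure
n = np.arange(1, n_max + 1)

for _ in range(num_paths):
    x = rng.standard_normal(n_max)    # X_1, ..., X_n i.i.d. N(0, 1)
    s = np.cumsum(x) / n              # running sample means S_1, ..., S_n
    plt.plot(n, s, linewidth=0.8)

plt.axhline(0.0, color="k", linestyle="--", label="E(X) = 0")
plt.xscale("log")                     # log axis shows the shrinking fluctuations
plt.xlabel("n")
plt.ylabel("s_n")
plt.legend()
plt.title("Sample sequences of S_n for i.i.d. N(0, 1)")
plt.show()
```

Computing the running mean as a cumulative sum divided by n gives all of S_1, ..., S_n in one pass, and the log-scaled n axis makes the shrinking fluctuations (roughly ±2 near n = 10 down to roughly ±0.05 near n = 10,000 in the notes' figure) visible in a single panel instead of four.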