EE378 Statistical Signal Processing    Lecture 3 - 04/11/2007

Mean-Square Ergodicity, Wide-Sense Ergodicity, Martingales

Lecturer: Tsachy Weissman    Scribes: Sudeepto Chakraborty, Lei Zhao, Kamakshi S

1  Mean-Square Ergodicity

Definition 1. A WSS process \{X(n)\} (with E[X(n)] = \mu and r(k) = \mathrm{Cov}(X(n+k), X(n))) is

1. Mean-square ergodic in the 1st moment if

   \lim_{N \to \infty} E\left[\left(\frac{1}{2N+1}\sum_{n=-N}^{N} X(n) - \mu\right)^2\right] = 0,   (1)

2. Mean-square ergodic in the 2nd moment if, for all k,

   \lim_{N \to \infty} E\left[\left(\frac{1}{2N+1}\sum_{n=-N}^{N} (X(n)-\mu)(X(n-k)-\mu) - r(k)\right)^2\right] = 0.   (2)

3. A WSS process is wide-sense ergodic if it is mean-square ergodic in both the 1st and the 2nd moments.

Example 1. \{X(n)\} is an i.i.d. process with the distribution

   X(n) = \begin{cases} +1 & \text{w.p. } 1/2 \\ -1 & \text{w.p. } 1/2 \end{cases}

Verify that this process is wide-sense ergodic.

Example 2. \{X(n)\} is a special case of a DC process: the entire sample path is a single random constant,

   X(n) = \begin{cases} +1 \;\; \forall n & \text{w.p. } 1/2 \\ -1 \;\; \forall n & \text{w.p. } 1/2 \end{cases}

Here \mu = 0, while the sample mean equals \pm 1 on every realization, so

   E\left[\left(\frac{1}{2N+1}\sum_{n=-N}^{N} X(n)\right)^2\right] = 1 \not\to 0.

Therefore \{X(n)\} is not mean-square ergodic in the 1st moment, and hence not wide-sense ergodic. In reference to Theorem 9 in Lecture Notes 2, this is an example of a mixture of ergodic processes (namely, a mixture of degenerate processes, each of whose components equal the same deterministic constant).

These rather extreme examples make the point that if the process "memory" is sufficiently short, the process is mean-square ergodic; if the "memory" is too long, it is not. The rate of decay of the covariance sequence can be thought of as a measure of memory.
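The contrast between Examples 1 and 2 can be checked numerically. The following sketch (not part of the original notes; the function and variable names are ours) Monte-Carlo estimates the quantity inside the limit in (1) for both processes, with \mu = 0:

```python
import random

def mean_sq_error_of_sample_mean(sample_path, num_trials, N, mu=0.0):
    """Monte Carlo estimate of E[( (1/(2N+1)) * sum_{n=-N}^{N} X(n) - mu )^2]."""
    total = 0.0
    for _ in range(num_trials):
        xs = sample_path(2 * N + 1)          # one realization of X(-N), ..., X(N)
        m = sum(xs) / (2 * N + 1)            # time average over the window
        total += (m - mu) ** 2
    return total / num_trials

# Example 1: i.i.d., each sample +1 or -1 with probability 1/2.
iid = lambda n: [random.choice((-1.0, 1.0)) for _ in range(n)]

# Example 2: DC process -- the whole path is +1 or -1 with probability 1/2.
def dc(n):
    v = random.choice((-1.0, 1.0))
    return [v] * n

random.seed(0)
for N in (10, 100, 1000):
    # iid error shrinks roughly like 1/(2N+1); dc error stays at 1 for every N.
    print(N, mean_sq_error_of_sample_mean(iid, 2000, N),
             mean_sq_error_of_sample_mean(dc, 2000, N))
```

For the DC process every realization has time average exactly +1 or -1, so the squared deviation from \mu = 0 is identically 1, whereas for the i.i.d. process the estimate decays with N, consistent with the definitions above.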
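The closing remark about covariance decay can be made precise by a computation that does not appear in this preview but is the standard expansion of the variance of the time average (using only the quantities \mu, r(k) defined above):

```latex
E\left[\left(\frac{1}{2N+1}\sum_{n=-N}^{N} X(n) - \mu\right)^2\right]
  = \frac{1}{(2N+1)^2}\sum_{n=-N}^{N}\sum_{m=-N}^{N} r(n-m)
  = \frac{1}{2N+1}\sum_{k=-2N}^{2N}\left(1 - \frac{|k|}{2N+1}\right) r(k),
```

since exactly 2N+1-|k| pairs (n, m) in the window satisfy n - m = k. The right-hand side tends to 0 whenever r(k) \to 0 as |k| \to \infty (short memory), while in Example 2 we have r(k) \equiv 1 and the expression equals 1 for every N (long memory).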