EE 378 Statistical Signal Processing                         Handout #5
Wednesday, May 2, 2007

Homework Set #4
Due: Thursday, May 10, 2007.

Announcement: You can hand in the HW either after class or deposit it, before 5pm, in the "Homework in" box in the 378 drawer of the class file cabinet on the second floor of the Packard Building.

1. Consider the process X(n) defined by

       X(n) = -a X(n-1) + W(n),

   where W(n) is a zero-mean white noise process with unit variance, and |a| < 1.

   (a) Find the optimal linear predictor of X(n) based on X(n-1).

   (b) Find the optimal linear predictor of X(n) based on X(n-1) and X(n-2).

   (c) Is it the case that W(n) = X(n) - X_opt(n), where X_opt(n) denotes the optimal linear predictor of X(n) based on its (whole) past?

   (d) Suppose Y(n) is WSS with innovation process W(n), and that the optimal linear predictor of Y(n) based on Y(n-1) and Y(n-2) does not depend on Y(n-2). Does it hold that W(n) is Y(n) minus the optimal linear predictor of Y(n) based on Y(n-1)?

2. Let S ~ Bernoulli(1/2), let A be a zero-mean random variable with variance sigma_A^2, and let W(n) be a zero-mean white noise process with variance sigma_W^2 > 0. Assume S, A, and {W(n)} are statistically independent, and define the process X(n) as

       X(n) = { W(n),  n + S even,
               { A,     n + S odd.

   ...
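As a numerical sanity check for Problem 1 (not a substitute for the analytical derivation), one can simulate the AR(1) recursion and fit X(n) by least squares on [X(n-1), X(n-2)]; for a stationary process this approximates the optimal linear predictor coefficients. The value a = 0.8 and the Gaussian noise are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
a = 0.8       # example value with |a| < 1 (assumed for illustration)
N = 200_000   # long sample path so the least-squares fit is accurate

# Simulate X(n) = -a X(n-1) + W(n) with zero-mean, unit-variance white noise.
w = rng.standard_normal(N)
x = np.zeros(N)
for n in range(1, N):
    x[n] = -a * x[n - 1] + w[n]

# Regress X(n) on [X(n-1), X(n-2)]; the fitted coefficients approximate
# the optimal linear predictor based on the two most recent samples.
target = x[2:]
regressors = np.column_stack([x[1:-1], x[:-2]])
coef, *_ = np.linalg.lstsq(regressors, target, rcond=None)
print(coef)
```

Empirically the first coefficient comes out near -a and the second near 0, consistent with the intuition that for an AR(1) process the extra regressor X(n-2) adds nothing.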
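The switching construction in Problem 2 can be sketched in a few lines of simulation. Gaussian draws for A and W(n), and the specific variance values, are assumptions for concreteness; the problem only requires zero mean and the stated variances.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_A, sigma_W = 1.0, 1.0   # example standard deviations (assumed)
N = 10

S = int(rng.integers(0, 2))             # S ~ Bernoulli(1/2)
A = rng.normal(0.0, sigma_A)            # single zero-mean draw, reused at every odd slot
W = rng.normal(0.0, sigma_W, size=N)    # zero-mean white noise samples

# X(n) = W(n) when n + S is even, and X(n) = A when n + S is odd.
n = np.arange(N)
X = np.where((n + S) % 2 == 0, W, A)
print(X)
```

Note that every index with n + S odd carries the same realization A, while the even-parity indices carry fresh, independent noise samples; this mix of a frozen random value and white noise is what makes the process's stationarity properties interesting.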
Spring 2007. Instructor: I. Weissman.
