# hw4 - EE 378 Handout #5: Statistical Signal Processing

EE 378 Handout #5
Statistical Signal Processing
Wednesday, May 2, 2007

**Homework Set #4**
Due: Thursday, May 10, 2007.

**Announcement:** You can hand in the HW either after class or deposit it, before 5 pm, in the "Homework in" box in the 378 drawer of the class file cabinet on the second floor of the Packard Building.

1. Consider the process $X(n)$ defined by
   $$X(n) = -aX(n-1) + W(n),$$
   where $W(n)$ is a zero-mean white noise process with unit variance, and $|a| < 1$.

   (a) Find the optimal linear predictor of $X(n)$ based on $X(n-1)$.

   (b) Find the optimal linear predictor of $X(n)$ based on $X(n-1)$ and $X(n-2)$.

   (c) Is it the case that $W(n) = X(n) - \hat{X}_{\mathrm{opt}}(n)$, where $\hat{X}_{\mathrm{opt}}(n)$ denotes the optimal linear predictor of $X(n)$ based on its (whole) past?

   (d) Suppose $Y(n)$ is WSS with innovation process $W(n)$, and that the optimal linear predictor of $Y(n)$ based on $Y(n-1)$ and $Y(n-2)$ does not depend on $Y(n-2)$. Does it hold that $W(n)$ is $Y(n)$ minus the optimal linear predictor of $Y(n)$ based on $Y(n-1)$?

2. Let $S \sim \mathrm{Bernoulli}(1/2)$, let $A$ be a zero-mean random variable with variance $\sigma_A^2$, and let $W(n)$ be a zero-mean white noise process with variance $\sigma_W^2 > 0$. Assume $S$, $A$, and $\{W(n)\}$ are statistically independent, and define the process $X(n)$ as
   $$X(n) = \begin{cases} W(n), & n + S \text{ even} \\ A, & n + S \text{ odd.} \end{cases}$$
   …
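As a numerical sketch (not a solution to Problem 1), one can simulate the AR(1) process $X(n) = -aX(n-1) + W(n)$ and fit the best linear one-step predictor by least squares; the fitted coefficient should approach the theoretical value. The choices $a = 0.5$, Gaussian noise, and $N = 200{,}000$ samples are illustrative assumptions, not part of the handout.

```python
import numpy as np

# Illustrative sketch: simulate X(n) = -a*X(n-1) + W(n) with
# zero-mean, unit-variance white noise W(n), then estimate the
# best linear predictor of X(n) from X(n-1) by least squares.
# a = 0.5 and N = 200_000 are assumed values for the demo.
rng = np.random.default_rng(0)
a, N = 0.5, 200_000
w = rng.standard_normal(N)        # unit-variance white noise
x = np.zeros(N)
for n in range(1, N):
    x[n] = -a * x[n - 1] + w[n]

# Least-squares coefficient c minimizing E[(X(n) - c*X(n-1))^2]:
# c = E[X(n)X(n-1)] / E[X(n-1)^2].
c = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
print(f"empirical predictor coefficient: {c:.3f}")
```

With a long enough run, the empirical coefficient lands close to $-a$, which matches the intuition that $W(n)$ is uncorrelated with the past of the process.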
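The switched process in Problem 2 can likewise be explored empirically. This sketch draws many independent realizations and checks the marginal moments: since $S \sim \mathrm{Bernoulli}(1/2)$, each fixed $n$ sees $W(n)$ or $A$ with probability $1/2$ each, so $E[X(n)] = 0$ and $E[X(n)^2] = (\sigma_W^2 + \sigma_A^2)/2$. The specific values $\sigma_A = 1$, $\sigma_W = 2$ and the Gaussian choices for $A$ and $W(n)$ are assumptions for illustration; the handout only fixes the means and variances.

```python
import numpy as np

# Illustrative sketch of Problem 2's process:
# X(n) = W(n) if n+S is even, X(n) = A if n+S is odd,
# with S ~ Bernoulli(1/2), A zero-mean, W(n) white noise.
# sigma_A, sigma_W, and Gaussian draws are assumed for the demo.
rng = np.random.default_rng(1)
sigma_A, sigma_W, n_len, trials = 1.0, 2.0, 8, 100_000

x = np.empty((trials, n_len))
for t in range(trials):
    s = rng.integers(0, 2)                # S ~ Bernoulli(1/2)
    a_val = rng.normal(0.0, sigma_A)      # single draw of A per realization
    w = rng.normal(0.0, sigma_W, n_len)   # white noise W(0..n_len-1)
    n = np.arange(n_len)
    x[t] = np.where((n + s) % 2 == 0, w, a_val)

# Each X(n) is W(n) or A with probability 1/2, so across realizations:
# E[X(n)] = 0 and E[X(n)^2] = (sigma_W**2 + sigma_A**2) / 2 for all n.
print("empirical means:", x.mean(axis=0).round(2))
print("empirical second moments:", (x ** 2).mean(axis=0).round(2))
```

The constant first and second moments across $n$ are what make the question of wide-sense stationarity (and of the innovation process) interesting here.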