# hw5 - EE 562a Homework Set 5 (Due Wednesday 21 March 2007)


(The following well-defined problems come from different sources, and the notation used may vary. Don't let that bother you!)

1. **Recursive Estimation - A Simple Kalman Filter**

   What if we have a sequence of observations, $\{x(u,i)\}_{i=1}^{\infty}$, and we would like to estimate an $n$-dimensional random vector, $v(u)$? Suppose that we know the best estimate of $v(u)$ based on the observations $\{x(u,i)\}_{i=1}^{k}$ and we now observe $x(u,k+1)$: do we need to start over and solve the new (larger-dimensional) estimation problem, or can we somehow update the estimate to account for the new information provided by $x(u,k+1)$? This is the subject of this problem.

   Let $v(u)$ be an $n$-dimensional, zero-mean, Gaussian random vector. Let the $i$th observation be the zero-mean, Gaussian random variable $x(u,i)$, and consider the estimation problem described above. You may assume that $v(u)$ and $\{x(u,i)\}_{i=1}^{\infty}$ are jointly Gaussian. Denote the $(k \times 1)$ vector of observations by

   $$\mathbf{x}_k(u) \triangleq \begin{bmatrix} x(u,k) \\ x(u,k-1) \\ \vdots \\ x(u,1) \end{bmatrix},$$

   and denote the unconstrained MMSE estimate of $v(u)$ based on the $k$ observations by

   $$\hat{v}_k(u) \triangleq E\{v(u) \mid x(u,k),\, x(u,k-1),\, \ldots,\, x(u,1)\} = E\{v(u) \mid \mathbf{x}_k(u)\}. \ldots$$
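The recursive update the problem asks about can be illustrated numerically. The sketch below assumes a hypothetical linear-Gaussian measurement model, $x(u,i) = h_i^T v(u) + w_i$ with independent noise $w_i$, which is one common case where $v(u)$ and the observations are jointly Gaussian (the model, the names `H`, `P0`, and `sigma2`, and the specific dimensions are all assumptions, not part of the problem statement). It processes the observations one at a time with the Kalman-style gain/innovation update and checks that the result equals the batch MMSE estimate computed from all $k$ observations at once:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 3, 5

# Hypothetical linear-Gaussian model (an assumption for illustration):
#   x(u, i) = h_i^T v(u) + w_i,  v ~ N(0, P0),  w_i ~ N(0, sigma2) i.i.d.
P0 = np.eye(n)                            # prior covariance of v(u)
sigma2 = 0.5                              # observation-noise variance
H = rng.standard_normal((k, n))           # rows are the h_i
v = rng.multivariate_normal(np.zeros(n), P0)
x = H @ v + np.sqrt(sigma2) * rng.standard_normal(k)

# --- Recursive MMSE: fold in one observation at a time ---
v_hat = np.zeros(n)                       # prior mean E{v(u)} = 0
P = P0.copy()                             # current error covariance
for i in range(k):
    h = H[i]
    K = P @ h / (h @ P @ h + sigma2)      # gain for the new observation
    v_hat = v_hat + K * (x[i] - h @ v_hat)  # correct by the innovation
    P = P - np.outer(K, h @ P)            # shrink the error covariance

# --- Batch MMSE: solve the full k-dimensional problem once ---
Cvx = P0 @ H.T                            # Cov(v, x_k),  n x k
Cxx = H @ P0 @ H.T + sigma2 * np.eye(k)   # Cov(x_k),     k x k
v_hat_batch = Cvx @ np.linalg.solve(Cxx, x)

print(np.allclose(v_hat, v_hat_batch))    # prints True
```

For jointly Gaussian variables the two computations agree exactly (up to floating point), which is the point of the problem: the new observation only requires a rank-one update, not a re-solve of the larger estimation problem.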

## This note was uploaded on 05/06/2008 for the course EE 562a taught by Professor Todd Brun during the Spring '07 term at USC.
