EE 562a Homework Set 5
Due Wednesday 21 March 2007

(The following well-defined problems come from different sources, and the notation used may vary. Don't let that bother you!)

1. Recursive Estimation - A Simple Kalman Filter

What if we have a sequence of observations, $\{x(u,i)\}_{i \ge 1}$, and we would like to estimate an $n$-dimensional random vector, $v(u)$? Suppose that we know the best estimate of $v(u)$ based on the observations $\{x(u,i)\}_{i=1}^{k}$ and we now observe $x(u,k+1)$: do we need to start over and solve the new (larger-dimensional) estimation problem, or can we somehow update the estimate to account for the new information provided by $x(u,k+1)$? This is the subject of this problem.

Let $v(u)$ be an $n$-dimensional, zero-mean, Gaussian random vector. Let the $i$th observation be the zero-mean, Gaussian random variable $x(u,i)$ and consider the estimation problem described above. You may assume that $v(u)$ and $\{x(u,i)\}_{i \ge 1}$ are jointly Gaussian. Denote the $(k \times 1)$ vector of observations by
$$
\mathbf{x}_k(u) \triangleq \begin{bmatrix} x(u,k) \\ x(u,k-1) \\ \vdots \\ x(u,1) \end{bmatrix},
$$
and denote the unconstrained MMSE estimate of $v(u)$ based on the $k$ observations by
$$
\hat{v}_k(u) \triangleq E\{\, v(u) \mid x(u,k), x(u,k-1), \ldots, x(u,1) \,\} = E\{\, v(u) \mid \mathbf{x}_k(u) \,\}.
$$
...
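To make the question above concrete, here is a minimal numerical sketch of the idea being asked about: for jointly Gaussian quantities, the MMSE estimate based on $k+1$ observations can be reached by updating the estimate based on the first $k$ observations, rather than re-solving the full batch problem. The linear observation model $x(u,i) = h_i^{T} v(u) + w_i$, the covariance values, and all variable names below are assumptions chosen purely for illustration; the problem statement itself only requires joint Gaussianity, and this sketch is not a substitute for the derivation the problem asks for.

```python
# Sketch (under assumed model): recursive vs. batch MMSE estimation of a
# zero-mean Gaussian vector v from scalar Gaussian observations.
import numpy as np

rng = np.random.default_rng(0)

n, k = 3, 5                          # dimension of v(u), number of observations
K_v = np.array([[2.0, 0.5, 0.0],
                [0.5, 1.0, 0.3],
                [0.0, 0.3, 1.5]])    # covariance of v(u) (hypothetical values)
H = rng.standard_normal((k + 1, n))  # observation vectors h_i^T (hypothetical)
sigma2 = 0.4                         # observation-noise variance (hypothetical)

v = rng.multivariate_normal(np.zeros(n), K_v)          # one realization of v(u)
x = H @ v + np.sqrt(sigma2) * rng.standard_normal(k + 1)  # observations x(u,i)

def batch_mmse(H, x, K_v, sigma2):
    """Unconstrained MMSE estimate E{v | x} computed from the full batch of data."""
    K_x = H @ K_v @ H.T + sigma2 * np.eye(len(x))  # covariance of the observations
    K_vx = K_v @ H.T                               # cross-covariance of v and x
    return K_vx @ np.linalg.solve(K_x, x)

# Recursive update: start from the zero-observation estimate (the mean of v)
# and fold in each new observation through its "innovation" -- the part of the
# observation not predictable from the observations already processed.
v_hat = np.zeros(n)     # current estimate
P = K_v.copy()          # error covariance of the current estimate
for i in range(k + 1):
    h = H[i]
    innovation = x[i] - h @ v_hat
    gain = P @ h / (h @ P @ h + sigma2)
    v_hat = v_hat + gain * innovation
    P = P - np.outer(gain, h @ P)

print("batch MMSE    :", batch_mmse(H, x, K_v, sigma2))
print("recursive MMSE:", v_hat)   # matches the batch solution
```

Under this assumed linear-Gaussian model the two printed estimates agree (up to numerical precision), which is exactly the point of the problem: the new observation only needs to be combined with the existing estimate and its error covariance, not with the entire history of observations.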