EE 562a Homework Set 5
Due Wednesday 21 March 2007

(The following well-defined problems come from different sources, and the notation used may vary. Don't let that bother you!)

1. Recursive Estimation - A Simple Kalman Filter

What if we have a sequence of observations $\{x(u,i)\}_{i \ge 1}$ and we would like to estimate an $n$-dimensional random vector $v(u)$? Suppose that we know the best estimate of $v(u)$ based on the observations $\{x(u,i)\}_{i=1}^{k}$ and we now observe $x(u,k+1)$. Do we need to start over and solve the new (larger-dimensional) estimation problem, or can we somehow update the estimate to account for the new information provided by $x(u,k+1)$? This is the subject of this problem.

Let $v(u)$ be an $n$-dimensional, zero-mean Gaussian random vector. Let the $i$th observation be the zero-mean Gaussian random variable $x(u,i)$, and consider the estimation problem described above. You may assume that $v(u)$ and $\{x(u,i)\}_{i \ge 1}$ are jointly Gaussian. Denote the $(k \times 1)$ vector of observations by
$$
\mathbf{x}_k(u) \triangleq \begin{bmatrix} x(u,k) \\ x(u,k-1) \\ \vdots \\ x(u,1) \end{bmatrix},
$$
and denote the unconstrained MMSE estimate of $v(u)$ based on the $k$ observations by
$$
\hat{v}_k(u) \triangleq E\{\, v(u) \mid x(u,k),\, x(u,k-1),\, \ldots,\, x(u,1) \,\} = E\{\, v(u) \mid \mathbf{x}_k(u) \,\}.
$$
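The recursive-update idea the problem asks about can be sketched numerically. The sketch below assumes a specific (hypothetical) scalar observation model, $x(u,i) = v(u) + w(u,i)$ with independent zero-mean Gaussian noise of variance `r` and prior variance `p0`; these model choices are illustrative assumptions, not part of the problem statement. It compares the batch MMSE estimate (Gaussian conditioning on all $k$ observations at once) against folding in one observation at a time with a scalar Kalman-style gain.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative model (not given in the problem statement):
#   x(u, i) = v(u) + w(u, i),  v ~ N(0, p0),  w(u, i) ~ N(0, r) i.i.d.
p0 = 2.0   # prior variance of v(u)
r = 0.5    # variance of each observation noise term
K = 5      # number of observations

v = rng.normal(0.0, np.sqrt(p0))
x = v + rng.normal(0.0, np.sqrt(r), size=K)

# Batch MMSE estimate for zero-mean jointly Gaussian variables:
#   E{v | x_k} = C_vx C_xx^{-1} x_k
C_vx = p0 * np.ones(K)                        # cov(v, x_i) = p0 for every i
C_xx = p0 * np.ones((K, K)) + r * np.eye(K)   # cov(x_i, x_j) = p0 + r*delta_ij
v_batch = C_vx @ np.linalg.solve(C_xx, x)

# Recursive estimate: update v_hat and its error variance P per observation.
v_hat, P = 0.0, p0
for xi in x:
    gain = P / (P + r)                   # scalar Kalman gain
    v_hat = v_hat + gain * (xi - v_hat)  # innovation = xi - predicted observation
    P = (1.0 - gain) * P                 # updated error variance

print(abs(v_hat - v_batch))  # the two estimates agree to numerical precision
```

For this model the recursive estimate matches the batch estimate exactly (both reduce to $p_0 \sum_i x_i / (r + k p_0)$), which is the point of the problem: the new observation can be absorbed without re-solving the full $k{+}1$-dimensional problem.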
Spring '07
Todd Brun