… σ_w².

(a) Find the initial least squares estimator of the state vector at time t = 2, in terms of the observations x_1 and x_2.

(b) If σ_w² = 0, so that we have ordinary linear regression with constant coefficients, and a third observation x_3 becomes available, apply the Kalman filter to show that the estimator of the state vector at time t = 3 is given by

    [μ̂_3, β̂_3]′ = [ (5/6)x_3 + (1/3)x_2 − (1/6)x_1 , (1/2)(x_3 − x_1) ]′.

3. (1 point each) Consider the AR(2) process

    X_t = φ_1 X_{t−1} + φ_2 X_{t−2} + W_t,   W_t ∼ N(0, σ²).

(a) Find a state space representation based on the state vector θ_t = (X_t, X_{t−1}).

(b) Find a state space representation based on the state vector θ_t = (X_t, X̂_t), where X̂_t is the optimal one-step-ahead predictor at time t. …
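As a numerical sanity check of part (b), the sketch below runs a Kalman filter for a local linear trend model with state θ_t = [μ_t, β_t]′, transition μ_t = μ_{t−1} + β_{t−1}, β_t = β_{t−1}, observation x_t = μ_t + v_t, and Q = 0 (since σ_w² = 0), starting from an approximately diffuse prior. The model setup is inferred from the preview (the problem statement is truncated), and the function name `kalman_trend` is illustrative.

```python
import numpy as np

def kalman_trend(xs, R=1.0):
    """Kalman filter for a constant-slope linear trend (assumed model)."""
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition: level += slope
    H = np.array([[1.0, 0.0]])               # we observe the level only
    a = np.zeros(2)                          # state estimate [mu, beta]
    P = 1e8 * np.eye(2)                      # approximately diffuse prior
    for x in xs:
        # predict step (Q = 0 because sigma_w^2 = 0)
        a = F @ a
        P = F @ P @ F.T
        # update step
        S = (H @ P @ H.T).item() + R
        K = (P @ H.T).ravel() / S
        a = a + K * (x - (H @ a).item())
        P = (np.eye(2) - np.outer(K, H.ravel())) @ P
    return a

x1, x2, x3 = 1.0, 2.0, 4.0
mu3, beta3 = kalman_trend([x1, x2, x3])
# Compare with the closed-form estimator from the problem statement:
assert abs(mu3 - (5/6 * x3 + 1/3 * x2 - 1/6 * x1)) < 1e-5
assert abs(beta3 - 0.5 * (x3 - x1)) < 1e-5
```

With σ_w² = 0 the filtered estimate at t = 3 coincides with the ordinary least squares fit of a straight line through (1, x_1), (2, x_2), (3, x_3) evaluated at t = 3, which is why the asserted formulas hold.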
 Spring '08
 MACKGALLOWAY