RLS - Least Squares: Motivation and the Least-Squares Algorithm


Least Squares: Motivation

- SDA/LMS assume a probabilistic model underlying the optimal filtering problem. Least squares instead minimizes a deterministic cost function:

      J(w) = \sum_{i=0}^{N-1} |d(i) - u_i w|^2

- SDA/LMS assume access to ensemble statistics and multiple realizations. Least squares relies on a deterministic orthogonality principle:

      A^* (d - A w_{ls}) = 0

- SDA/LMS assume ergodicity in the absence of multiple realizations. Least squares solves deterministic normal equations:

      (A^* A) w_{ls} = A^* d

- The speed of convergence of SDA/LMS is tied to the eigenvalue spread of R_{uu}; least squares computes its solution in one step.

Least-Squares Algorithm

- Data matrix and desired signal vector: stack the regressors u_i as the rows of A (N x M) and the reference samples d(i) into the vector d (N x 1).
- The optimal solution requires a matrix inversion (A is assumed to have full column rank):

      w_{ls} = (A^* A)^{-1} A^* d

- Least-squares recipe: collect the data into A and d, then solve the normal equations above; a numerical sketch follows these notes.
- SVD solution to min_w ||d - A w||^2: with A = U \Sigma V^*,

      w_{ls} = V \Sigma^{+} U^* d = A^{+} d,

  where A^{+} is the pseudo-inverse; this form also covers the rank-deficient case.
- Achievable MMSE (the minimum value of the cost):

      E_{min} = ||d||^2 - d^* A (A^* A)^{-1} A^* d

- Alternative form of the MMSE:

      E_{min} = ||d - A w_{ls}||^2

- Optimal solution in terms of the data matrix: w_{ls} = A^{+} d.
- Regularized least-squares solution (\lambda > 0):

      w = (\lambda I + A^* A)^{-1} A^* d

Least-Squares Properties

- Linear regression model: d = A w_o + v, where v is measurement noise.
- If the measurement error is zero-mean and white, w_{ls} is unbiased:

      E[w_{ls}] = w_o

- If the measurement error is zero-mean and white with variance \sigma_v^2, the covariance of w_{ls} is given by:

      Cov(w_{ls}) = \sigma_v^2 (A^* A)^{-1}

- If the error is further Gaussian, w_{ls} is the minimum-variance unbiased estimator (MVUE). A Monte Carlo check of these properties follows the numerical sketch below.
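The following is a minimal NumPy sketch of the least-squares recipe above. The data matrix A, desired vector d, noise level, and regularization weight lam are made-up illustrative values, not from the original slides; it compares the normal-equations, SVD/pseudo-inverse, and regularized solutions, and verifies that the two MMSE forms agree.

    import numpy as np

    rng = np.random.default_rng(0)
    N, M = 100, 4                      # N data points, M filter taps
    A = rng.standard_normal((N, M))    # data matrix (rows are regressors u_i)
    w_true = rng.standard_normal(M)    # "true" filter used to generate d
    d = A @ w_true + 0.1 * rng.standard_normal(N)  # desired signal + noise

    # 1) Normal equations: (A^T A) w = A^T d  (requires full column rank)
    w_ne = np.linalg.solve(A.T @ A, A.T @ d)

    # 2) SVD / pseudo-inverse solution: w = A^+ d (numerically preferred)
    w_svd = np.linalg.pinv(A) @ d

    # 3) Regularized least squares: w = (lam*I + A^T A)^{-1} A^T d
    lam = 1e-3
    w_reg = np.linalg.solve(lam * np.eye(M) + A.T @ A, A.T @ d)

    # Achievable MMSE: ||d||^2 - d^T A w_ls equals ||d - A w_ls||^2
    E_min = d @ d - d @ (A @ w_ne)
    assert np.isclose(E_min, np.sum((d - A @ w_ne) ** 2))
    print(w_ne, w_svd, w_reg, E_min)

In practice the SVD/pseudo-inverse route is preferred over explicitly forming A^T A, since the normal equations square the condition number of the problem.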
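And here is a small Monte Carlo check of the stated properties, assuming the linear regression model d = A w_o + v with zero-mean white noise: the sample mean of w_ls should approach w_o (unbiasedness) and its sample covariance should approach \sigma_v^2 (A^T A)^{-1}. The specific A, w_o, noise variance, and trial count are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(1)
    N, M, sigma_v = 200, 3, 0.5
    A = rng.standard_normal((N, M))    # fixed data matrix across trials
    w_o = np.array([1.0, -2.0, 0.5])   # true regression vector

    AtA = A.T @ A                      # precompute for repeated solves
    trials = 20000
    estimates = np.empty((trials, M))
    for t in range(trials):
        v = sigma_v * rng.standard_normal(N)   # zero-mean white noise
        d = A @ w_o + v                        # linear regression model
        estimates[t] = np.linalg.solve(AtA, A.T @ d)

    print("mean estimate:", estimates.mean(axis=0))          # ~ w_o (unbiased)
    print("empirical cov:\n", np.cov(estimates.T))
    print("theoretical cov:\n", sigma_v**2 * np.linalg.inv(AtA))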