Least Squares: Motivation
The steepest-descent and LMS algorithms (SDA/LMS) rest on statistical assumptions that the least-squares method avoids:

- SDA/LMS assume a probabilistic model underlying the optimal filtering problem.
- SDA/LMS assume access to ensemble statistics and multiple realizations.
- SDA/LMS assume ergodicity in the absence of multiple realizations.
- The speed of convergence of SDA/LMS is tied to the eigenvalue spread of the autocorrelation matrix Ruu.

Least-Squares Algorithm

Least squares replaces these statistical quantities with deterministic counterparts: a deterministic cost function, a deterministic orthogonality principle, and deterministic normal equations.
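The deterministic quantities named above can be sketched as follows (a standard formulation; the notation here, with data matrix U, desired vector d, and weight vector w, is assumed since the excerpt's equations are missing):

```latex
% Deterministic least-squares cost function:
J(\mathbf{w}) = \sum_{i} \left| d(i) - \mathbf{u}_i \mathbf{w} \right|^2
             = \left\| \mathbf{d} - \mathbf{U}\mathbf{w} \right\|^2

% Deterministic orthogonality principle: at the optimum the residual
% is orthogonal to the columns of the data matrix,
\mathbf{U}^H \left( \mathbf{d} - \mathbf{U}\mathbf{w}_{ls} \right) = \mathbf{0}

% which yields the deterministic normal equations:
\mathbf{U}^H \mathbf{U} \, \mathbf{w}_{ls} = \mathbf{U}^H \mathbf{d}
```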
- The optimal solution requires a matrix inversion and is expressed in terms of the data matrix and the desired signal vector.
- Least-squares recipe: an SVD solution to the least-squares problem avoids explicitly inverting an ill-conditioned matrix.
- Achievable minimum cost (MMSE), and an alternative form of the MMSE.
- Optimal solution in terms of the data matrix.
- Regularized least-squares solution and its properties.
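A minimal NumPy sketch of the solutions listed above (normal equations, SVD/pseudoinverse, and regularized least squares). The matrix U, vector d, and regularization weight delta are illustrative assumptions, not values from the notes:

```python
import numpy as np

rng = np.random.default_rng(0)
U = rng.standard_normal((50, 4))   # data matrix (50 samples, 4 taps) -- assumed sizes
d = rng.standard_normal(50)        # desired signal vector

# Normal equations: (U^H U) w = U^H d  -> requires a matrix inversion.
w_ne = np.linalg.solve(U.T @ U, U.T @ d)

# SVD solution: apply the pseudoinverse of U; numerically more robust
# than forming U^H U when U is ill-conditioned.
w_svd = np.linalg.pinv(U) @ d

# Regularized least squares: w = (U^H U + delta*I)^{-1} U^H d.
delta = 1e-3
w_reg = np.linalg.solve(U.T @ U + delta * np.eye(4), U.T @ d)

# Achievable minimum cost in its alternative form: ||d||^2 - d^H U w_ls,
# which equals the squared norm of the optimal residual.
e_min = d @ d - d @ (U @ w_ne)

print(np.allclose(w_ne, w_svd))    # the two unregularized solutions agree: True
```

For a well-conditioned U the three estimates are nearly identical; the regularized solution deviates as delta grows.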
Least Squares and Linear Regression

- Linear regression model: the desired vector is a linear function of the data matrix plus a measurement-error vector.
- If the measurement error is zero-mean and white, wls is an unbiased estimate of the true regression vector.
- In that case the covariance of wls is given by sigma_v^2 (U^H U)^{-1}, where sigma_v^2 is the error variance.
- If the error is further Gaussian, wls is the minimum-variance unbiased estimator (MVUE).
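The unbiasedness and covariance claims can be checked with a small Monte Carlo sketch. The model d = U w0 + v and all sizes and variances below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, sigma = 200, 3, 0.5
U = rng.standard_normal((n, p))          # fixed data matrix (assumed sizes)
w0 = np.array([1.0, -2.0, 0.5])          # true regression vector (assumed)

# Closed-form covariance under zero-mean white error: sigma^2 (U^H U)^{-1}.
cov_theory = sigma**2 * np.linalg.inv(U.T @ U)

trials = 2000
ests = np.empty((trials, p))
for t in range(trials):
    v = sigma * rng.standard_normal(n)   # zero-mean white (Gaussian) error
    d = U @ w0 + v
    ests[t] = np.linalg.solve(U.T @ U, U.T @ d)   # w_ls for this realization

bias = ests.mean(axis=0) - w0            # near zero: w_ls is unbiased
cov_emp = np.cov(ests, rowvar=False)     # matches cov_theory up to sampling error
```

Averaging over realizations, the empirical mean of w_ls approaches w0 and its empirical covariance approaches sigma^2 (U^H U)^{-1}.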
This note was uploaded on 12/02/2011 for the course AR 107 taught by Professor Gracegraham during the Fall '11 term at Montgomery College.