EXPRLS - RLS Algorithm

RLS Algorithm: Motivation

- The least-squares cost function and its solution are non-iterative and not amenable to adaptation.
- We want a cost function and solution that respond to changes in the signal environment.
- We want to avoid the least-squares prescription of direct matrix inversion.

Recursive Least Squares (RLS)

- Cost function: $\mathcal{E}(n) = \sum_{i=1}^{n} \lambda^{n-i}\,|e(i)|^2$, with $e(i) = d(i) - \mathbf{w}^H(n)\,\mathbf{u}(i)$.
- Time-varying ACF and cross-correlation: $\boldsymbol{\Phi}(n) = \sum_{i=1}^{n} \lambda^{n-i}\,\mathbf{u}(i)\,\mathbf{u}^H(i)$ and $\mathbf{z}(n) = \sum_{i=1}^{n} \lambda^{n-i}\,\mathbf{u}(i)\,d^*(i)$.
- Deterministic normal equations: $\boldsymbol{\Phi}(n)\,\hat{\mathbf{w}}(n) = \mathbf{z}(n)$.

RLS Algorithm

- Rank-one update: $\boldsymbol{\Phi}(n) = \lambda\,\boldsymbol{\Phi}(n-1) + \mathbf{u}(n)\,\mathbf{u}^H(n)$.
- RLS gain vector: $\mathbf{k}(n) = \dfrac{\lambda^{-1}\,\mathbf{P}(n-1)\,\mathbf{u}(n)}{1 + \lambda^{-1}\,\mathbf{u}^H(n)\,\mathbf{P}(n-1)\,\mathbf{u}(n)}$, where $\mathbf{P}(n) = \boldsymbol{\Phi}^{-1}(n)$.
- Inverse update using the matrix inversion lemma: $\mathbf{P}(n) = \lambda^{-1}\bigl[\mathbf{P}(n-1) - \mathbf{k}(n)\,\mathbf{u}^H(n)\,\mathbf{P}(n-1)\bigr]$ (sketched in code below).
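The three recursions above fit in a few lines of code. Here is a minimal sketch of one RLS iteration, assuming real-valued signals and NumPy; the function name rls_update and its argument names are illustrative, not part of the original slides.

```python
import numpy as np

def rls_update(w, P, u, d, lam=0.99):
    """One RLS iteration (real-valued case).

    w   : (M,)  tap-weight vector w(n-1)
    P   : (M,M) inverse correlation matrix P(n-1) = Phi^{-1}(n-1)
    u   : (M,)  regressor u(n)
    d   : scalar desired response d(n)
    lam : forget factor lambda
    """
    Pu = P @ u
    k = Pu / (lam + u @ Pu)              # RLS gain vector k(n)
    alpha = d - w @ u                    # innovation (a priori error) alpha(n)
    w_new = w + k * alpha                # tap-weight update
    P_new = (P - np.outer(k, Pu)) / lam  # inverse update via the matrix inversion lemma
    return w_new, P_new, alpha
```

The gain line is the slide formula with numerator and denominator multiplied through by lambda, which saves a separate division; no matrix is ever inverted explicitly, only the rank-one correction of P is applied.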

RLS Algorithm

- RLS gain vector: solution to the linear system $\boldsymbol{\Phi}(n)\,\mathbf{k}(n) = \mathbf{u}(n)$, i.e. $\mathbf{k}(n) = \mathbf{P}(n)\,\mathbf{u}(n)$.
- Innovations process (a priori error): $\alpha(n) = d(n) - \hat{\mathbf{w}}^H(n-1)\,\mathbf{u}(n)$.
- Tap-weight update: $\hat{\mathbf{w}}(n) = \hat{\mathbf{w}}(n-1) + \mathbf{k}(n)\,\alpha^*(n)$.
- Whitening form of the update: $\hat{\mathbf{w}}(n) = \hat{\mathbf{w}}(n-1) + \mathbf{P}(n)\,\mathbf{u}(n)\,\alpha^*(n)$.

RLS Algorithm

- Recursion for the MMSE: $\mathcal{E}_{\min}(n) = \lambda\,\mathcal{E}_{\min}(n-1) + \alpha(n)\,e^*(n)$, where $e(n) = d(n) - \hat{\mathbf{w}}^H(n)\,\mathbf{u}(n)$ is the a posteriori error.
- Conversion factor: $\gamma(n) = \dfrac{e(n)}{\alpha(n)} = 1 - \mathbf{k}^H(n)\,\mathbf{u}(n)$.
- The tap-weight update weights and smooths the innovations.

RLS Algorithm

- The forget factor $\lambda$ weights prior information relative to current information.
- The choice of $\lambda$ determines the speed of adaptation.
- Initialization of the RLS algorithm: $\hat{\mathbf{w}}(0) = \mathbf{0}$ and $\mathbf{P}(0) = \delta^{-1}\,\mathbf{I}$, with $\delta$ a small positive constant (see the demo below).
- The whitening approach accounts for an order-of-magnitude improvement in convergence.
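To illustrate the initialization, the conversion factor, and the MMSE recursion, here is a short hypothetical system-identification demo reusing the rls_update sketch above; the channel h, the noise level, and all parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, lam, delta = 4, 500, 0.99, 1e-2
h = np.array([0.8, -0.4, 0.2, 0.1])    # unknown 4-tap channel (made up)

w = np.zeros(M)                        # w(0) = 0
P = np.eye(M) / delta                  # P(0) = delta^{-1} I, delta small and positive

x = rng.standard_normal(N)             # white input signal
mmse = 0.0
for n in range(M - 1, N):
    u = x[n - M + 1:n + 1][::-1]       # regressor [x(n), ..., x(n-M+1)]
    d = h @ u + 0.01 * rng.standard_normal()  # noisy desired response
    w, P, alpha = rls_update(w, P, u, d, lam)
    gamma = 1.0 - u @ (P @ u)          # conversion factor gamma(n) = e(n)/alpha(n)
    e = gamma * alpha                  # a posteriori error e(n)
    mmse = lam * mmse + alpha * e      # MMSE recursion

print(np.round(w, 3))                  # w(n) settles near h within tens of samples
```

A $\lambda$ near 1 gives long memory and slow adaptation, while a smaller $\lambda$ tracks a changing environment faster at the cost of noisier weight estimates; a large $\delta^{-1}$ in $\mathbf{P}(0)$ encodes weak prior information, so the early data quickly dominate the estimate.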