NLMS Algorithm

NLMS Algorithm: Motivation

- Want the tap-weight update to reflect changes in the statistics of u[n], specifically its average power.
- Want convergence of the algorithm to be relatively independent of the eigenvalue spread χ(R).
- Want to retain the steepest-descent flavor of the LMS algorithm.

NLMS Algorithm

- Cost function:
- Gradient w.r.t. w[n+1]:
- Gradient w.r.t. λ:

NLMS Algorithm

- Tap-weight update:
- Variable step-size:
- Norm update:
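The equations for the items above were images in the original slides and are not visible in this preview. As a hedged reconstruction, the standard constrained-optimization derivation of NLMS (the one found in adaptive-filtering textbooks such as Haykin's) that matches the listed items is:

```latex
% Cost function: minimize the change in the tap-weight vector subject to the
% posterior filter output matching the desired response d[n] exactly
% (lambda = Lagrange multiplier, u[n] = regressor vector).
J[n] = \|\mathbf{w}[n+1]-\mathbf{w}[n]\|^{2}
     + \operatorname{Re}\!\left\{\lambda^{*}\bigl(d[n]-\mathbf{w}^{H}[n+1]\,\mathbf{u}[n]\bigr)\right\}

% Gradient w.r.t. w[n+1] set to zero (up to conjugation conventions):
\mathbf{w}[n+1] = \mathbf{w}[n] + \tfrac{1}{2}\,\lambda\,\mathbf{u}[n]

% Gradient w.r.t. lambda recovers the constraint d[n] = w^H[n+1] u[n];
% solving with the a-priori error e[n] = d[n] - w^H[n] u[n] gives
% lambda = 2 e^*[n] / ||u[n]||^2, hence the tap-weight update
% (with normalized step-size mu-tilde and offset delta):
\mathbf{w}[n+1] = \mathbf{w}[n]
  + \frac{\tilde{\mu}}{\delta+\|\mathbf{u}[n]\|^{2}}\,\mathbf{u}[n]\,e^{*}[n]

% Variable step-size and sliding-window norm update for an M-tap filter:
\mu[n] = \frac{\tilde{\mu}}{\delta+\|\mathbf{u}[n]\|^{2}},\qquad
\|\mathbf{u}[n]\|^{2} = \|\mathbf{u}[n-1]\|^{2} + |u[n]|^{2} - |u[n-M]|^{2}
```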


NLMS Algorithm (continued)

- Normalized step-size:
- The offset parameter δ is used to avoid divide-by-zero problems.
- The direction of the tap-weight update is still the direction of steepest descent.
- Normalizing the step-size removes sensitivity to the eigenvalue spread χ(R).
- Convergence characteristics are superior to those of the LMS.

NLMS: Step-size Analysis

- Mean-square stability:
- Optimal step-size:
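The properties above can be sketched in code. Below is a minimal NLMS sketch in Python/NumPy under stated assumptions: real-valued signals, a system-identification setup, and illustrative names (`nlms`, the toy system `h`) that are not from the slides. The default `mu` reflects the standard mean-square stability condition 0 < μ̃ < 2.

```python
import numpy as np

def nlms(u, d, M, mu=0.5, delta=1e-6):
    """NLMS: adapt an M-tap filter so that w @ u_n tracks d[n].

    mu    : normalized step-size (mean-square stable for 0 < mu < 2)
    delta : small offset guarding against division by zero when the
            regressor power ||u[n]||^2 is tiny
    """
    w = np.zeros(M)
    e = np.zeros(len(u))
    for n in range(M - 1, len(u)):
        u_n = u[n - M + 1:n + 1][::-1]   # regressor [u[n], ..., u[n-M+1]]
        e[n] = d[n] - w @ u_n            # a-priori error
        # Normalized update: the effective step shrinks as input power grows,
        # so convergence is insensitive to the scale (and spread) of u.
        w += (mu / (delta + u_n @ u_n)) * u_n * e[n]
    return w, e

# Identify a toy 3-tap FIR system (illustrative, not from the slides).
rng = np.random.default_rng(0)
h = np.array([0.8, -0.4, 0.2])
u = rng.standard_normal(5000)
d = np.convolve(u, h)[:len(u)]
w, e = nlms(u, d, M=3)
print(np.round(w, 3))   # converges toward h
```

Because the step is normalized by the instantaneous regressor power, the same `mu` works whether `u` has unit variance or is scaled by orders of magnitude, which is exactly the LMS sensitivity the slides say NLMS removes.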

This note was uploaded on 12/02/2011 for the course AR 107 taught by Professor Gracegraham during the Fall '11 term at Montgomery College.
