lsq - On Least Squares Inversion

On Least Squares Inversion

A problem of importance that appears often in optimal estimation is least-squares inversion. Here we look at the basic idea behind least-squares estimation. Consider a system of m linear equations in n unknowns:

    Ax = b.

Our goal is to look at two specific situations of interest:

1. rank(A) = m, i.e., A is of full row rank.
2. rank(A) = n, i.e., A is of full column rank.

Let us first look at the case where the system has full row rank. Here range(A) is all of C^m, so a solution exists for every b, but the rectangular system described above does not have a unique solution: with n > m the nullspace of A is nontrivial. If the matrix A is of full row rank, then the matrix AA^H is self-adjoint and, furthermore,

    rank(AA^H) = rank(A) = m.

Upon making the substitution x = A^H y we obtain a modified system of linear equations:

    AA^H y = b.

Since the matrix AA^H is invertible, we can obtain the solution to the modified system as

    y = (AA^H)^{-1} b.

Substituting this solution back into x = A^H y, we obtain a solution of the original system:

    x = A^H (AA^H)^{-1} b.
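The derivation above can be checked numerically. A minimal sketch, assuming NumPy and a small made-up full-row-rank example (the sizes m = 2, n = 4 and the random data are illustrative, not from the note):

```python
import numpy as np

# Underdetermined system: m = 2 equations in n = 4 unknowns, full row rank.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 4))
b = rng.standard_normal(2)

# The solution from the derivation: x = A^H (A A^H)^{-1} b.
# Solving the m-by-m system A A^H y = b and then forming x = A^H y
# avoids explicitly inverting A A^H.
y = np.linalg.solve(A @ A.conj().T, b)
x = A.conj().T @ y

# It satisfies the original system exactly (up to rounding)...
assert np.allclose(A @ x, b)

# ...and matches the minimum-norm solution NumPy returns via the
# pseudoinverse and via lstsq for a full-row-rank system.
assert np.allclose(x, np.linalg.pinv(A) @ b)
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])
```

Among the infinitely many solutions of the underdetermined system, this particular x is the one of smallest Euclidean norm, which is why it agrees with `np.linalg.pinv` and `np.linalg.lstsq`.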

This note was uploaded on 12/02/2011 for the course AR 107 taught by Professor Gracegraham during the Fall '11 term at Montgomery College.
