# On Least Squares Inversion


A problem of importance that we will see appear often in optimal estimation is least-squares inversion. Here we will look at the basic idea behind least-squares estimation. Consider a system of m linear equations in n unknowns:

Ax = b.

Our goal is to look at two specific situations of interest:

1. rank(A) = m, i.e., A is of full row rank.
2. rank(A) = n, i.e., A is of full column rank.

Let us first look at the case where the system has full row rank (so m ≤ n). The rectangular system described above does not have a unique solution: b ∈ range(A), but when n > m there are infinitely many vectors x satisfying Ax = b. If the matrix A is of full row rank, then the m × m matrix AA^H is self-adjoint and furthermore

rank(AA^H) = rank(A) = m,

so AA^H is invertible. Upon making the substitution x = A^H y we obtain a modified system of linear equations:

AA^H y = b.

Since the matrix AA^H is invertible, we can obtain the solution to the modified system as

y = (AA^H)^{-1} b.

Substituting this solution back into the original system we obtain

x = A^H (AA^H)^{-1} b,

which satisfies Ax = b exactly; the matrix A^H (AA^H)^{-1} acts as a right inverse of A.
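The full row-rank construction above can be checked numerically. The sketch below (a minimal example with hypothetical random data, m = 2 equations in n = 4 unknowns) forms x = A^H (AA^H)^{-1} b and verifies that it solves Ax = b and agrees with the minimum-norm solution returned by NumPy's pseudoinverse:

```python
import numpy as np

# Hypothetical example data: m = 2 equations, n = 4 unknowns.
# A generic random matrix of this shape has full row rank.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 4))
b = rng.standard_normal(2)

# Solve the modified system (A A^H) y = b, then set x = A^H y,
# i.e., x = A^H (A A^H)^{-1} b.
y = np.linalg.solve(A @ A.conj().T, b)
x = A.conj().T @ y

# x solves the original underdetermined system exactly ...
assert np.allclose(A @ x, b)

# ... and matches the minimum-norm solution via the pseudoinverse,
# since A^H (A A^H)^{-1} is the right pseudoinverse of A.
assert np.allclose(x, np.linalg.pinv(A) @ b)
```

Note that `np.linalg.solve` is used on the small m × m system AA^H y = b rather than forming the inverse explicitly, which is the numerically preferable route.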



*This note was uploaded on 12/02/2011 for the course AR 107 taught by Professor Grace Graham during the Fall '11 term at Montgomery College.*



