Linear Least Squares Computations
Assuming that $A \in \mathbb{R}^{m \times n}$ has linearly independent columns, the problem
$$\arg\min_x \|Ax - b\|_2 \qquad (1)$$
has a unique solution, say $x_{LS}$, which is also the unique solution to the normal equations
$$A^t A x = A^t b. \qquad (2)$$
This suggests the normal equations approach to computing $x_{LS}$:
1. Form $C = A^t A$ and $w = A^t b$.
2. Compute the Cholesky factorization $C = LL^t$.
3. Solve $Ly = w$ by forsub (forward substitution) and then $L^t x = y$ by backsub (back substitution).
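The three steps above can be sketched in NumPy. This is a minimal illustration, not part of the notes: the helper names `forsub` and `backsub` follow the notes' terminology, and the triangular solves are written out explicitly rather than calling a library routine.

```python
import numpy as np

def forsub(L, w):
    # Solve Ly = w for lower-triangular L by forward substitution.
    n = L.shape[0]
    y = np.zeros(n)
    for i in range(n):
        y[i] = (w[i] - L[i, :i] @ y[:i]) / L[i, i]
    return y

def backsub(U, y):
    # Solve Ux = y for upper-triangular U by back substitution.
    n = U.shape[0]
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - U[i, i + 1:] @ x[i + 1:]) / U[i, i]
    return x

def normal_equations_ls(A, b):
    # Step 1: form C = A^t A and w = A^t b.
    C = A.T @ A
    w = A.T @ b
    # Step 2: Cholesky factorization C = L L^t
    # (np.linalg.cholesky returns the lower-triangular factor L).
    L = np.linalg.cholesky(C)
    # Step 3: Ly = w by forsub, then L^t x = y by backsub.
    y = forsub(L, w)
    return backsub(L.T, y)
```

The result should agree with a general-purpose least-squares solver whenever $A$ has linearly independent columns.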
This algorithm requires $mn^2 + \frac{1}{3}n^3 + O(mn)$ flops (taking advantage of the symmetry of $C$). It is an important method because it is fast and doesn't use very much memory. $Cx = w$ can be viewed as a compressed form of $\arg\min_x \|Ax - b\|_2$.
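To make the "compressed form" remark concrete, here is a small sketch (with hypothetical dimensions, not from the notes): the $n \times n$ system $Cx = w$ carries all the information needed from the much larger $m \times n$ problem.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 100, 4
A = rng.standard_normal((m, n))  # tall matrix with independent columns
b = rng.standard_normal(m)

C = A.T @ A                      # n x n "compressed" data, here 4 x 4
w = A.T @ b                      # n-vector
x_compressed = np.linalg.solve(C, w)

# The tiny n x n system yields the same solution as the full
# m x n least-squares problem.
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x_compressed, x_ls))
```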
We have other methods that, while more costly, are more robust in the face of rounding errors. The other methods arrive at $x_{LS}$ by a different route. Recall that the normal equations were a result of requiring that $b - Ax$ be orthogonal (normal) to the subspace $S = \mathrm{ColSp}(A)$. That is another way of saying that $Ax$ is the orthogonal projection of $b$ onto $S$. The solution to the normal equations is therefore the $x$ for which $Ax$ equals the orthogonal projection of $b$ onto $S$.
This note was uploaded on 12/18/2010 for the course PHYS 5073 taught by Professor Mark during the Fall '10 term at Arkansas.