The Gram-Schmidt Algorithm
The context of this discussion is a space of vectors $V$ with an inner product denoted $\langle v, w\rangle$. The norm associated to the inner product is denoted by $\|v\|$ and is defined by $\|v\| = \sqrt{\langle v, v\rangle}$. Orthogonality and orthonormality are defined relative to the inner product $\langle \cdot, \cdot\rangle$; that is, $u$ and $w$ are orthogonal if $\langle u, w\rangle = 0$, and they are orthonormal if they are orthogonal and $\|u\| = 1$, $\|w\| = 1$.
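These definitions can be made concrete in a short numerical sketch. The example below takes the standard Euclidean dot product on $\mathbb{R}^3$ as the inner product $\langle \cdot, \cdot\rangle$ (one particular choice; the definitions above apply to any inner product), and checks orthogonality and the norm for two specific vectors chosen for illustration.

```python
import numpy as np

# Concrete instance: the Euclidean inner product <v, w> = v . w on R^3.
v = np.array([1.0, 1.0, 0.0])
w = np.array([1.0, -1.0, 0.0])

inner = np.dot(v, w)            # <v, w>
norm_v = np.sqrt(np.dot(v, v))  # ||v|| = sqrt(<v, v>)

print(inner)   # 0.0 -> v and w are orthogonal
print(norm_v)  # sqrt(2) != 1 -> v and w are orthogonal but not orthonormal
```

Dividing each vector by its norm (here $v/\sqrt{2}$ and $w/\sqrt{2}$) would make the pair orthonormal.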
The problem we address is this: given a set $\{v_1, \dots, v_k\}$ of $k$ linearly independent vectors in $V$, find an orthonormal set of vectors $\{\hat{e}_1, \dots, \hat{e}_k\}$ such that

$$\operatorname{Span}\{\hat{e}_1, \dots, \hat{e}_j\} = \operatorname{Span}\{v_1, \dots, v_j\} \quad \text{for all } j = 1, 2, \dots, k. \tag{1}$$
Recall that this equality of spans means any linear combination of the vectors $v_1, \dots, v_j$ can be written as a linear combination of the vectors $\hat{e}_1, \dots, \hat{e}_j$, and vice versa.
To understand the Gram-Schmidt algorithm one needs to remember the following basic fact. If $\{u_1, \dots, u_n\}$ is a set of orthogonal vectors and $v$ is another vector, then

$$\sum_{i=1}^{n} \frac{\langle v, u_i\rangle}{\langle u_i, u_i\rangle}\, u_i \quad \text{is the projection of } v \text{ onto } \operatorname{Span}\{u_1, \dots, u_n\}, \text{ and} \tag{2}$$

$$v - \sum_{i=1}^{n} \frac{\langle v, u_i\rangle}{\langle u_i, u_i\rangle}\, u_i \quad \text{is orthogonal to every vector in } \operatorname{Span}\{u_1, \dots, u_n\}. \tag{3}$$
These two statements are really the same, because the projection of $v$ onto $\operatorname{Span}\{u_1, \dots, u_n\}$ is defined precisely by condition (3).
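Formulas (2) and (3) can be checked numerically. The sketch below uses the Euclidean inner product on $\mathbb{R}^3$ and a pair of orthogonal (but not orthonormal) vectors chosen for illustration; the helper `projection` is a direct transcription of formula (2), and the residual in formula (3) comes out orthogonal to each $u_i$.

```python
import numpy as np

def projection(v, us):
    """Projection of v onto Span{u_1, ..., u_n} for orthogonal nonzero u_i,
    computed as sum_i <v, u_i> / <u_i, u_i> * u_i  -- formula (2)."""
    return sum(np.dot(v, u) / np.dot(u, u) * u for u in us)

# Orthogonal (not orthonormal) vectors in R^3, plus an arbitrary v.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
v = np.array([3.0, 1.0, 4.0])

p = projection(v, [u1, u2])  # formula (2): p = [3, 1, 0]
r = v - p                    # formula (3): residual r = [0, 0, 4]

# r is orthogonal to each u_i, hence to every vector in Span{u1, u2}.
print(np.dot(r, u1), np.dot(r, u2))  # 0.0 0.0
```

Note that the denominators $\langle u_i, u_i\rangle$ are needed because the $u_i$ are only orthogonal; if they were orthonormal, each denominator would equal 1.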
Here is the algorithm for a given set