reasoning applied to A^T produces the dual result: The left nullspace N(A^T) and the column space C(A) are orthogonal complements. Their dimensions add up to (m − r) + r = m. This completes the second half of the fundamental theorem of linear algebra.

The entries r_ij = q_i^T a_j appear in formula (11), when ||A_j|| q_j is substituted for A_j:

    a_j = (q_1^T a_j) q_1 + ··· + (q_{j-1}^T a_j) q_{j-1} + ||A_j|| q_j = Q times column j of R.   (13)
3U Every m by n matrix with independent columns can be factored into A = QR.
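As a quick numerical illustration of 3U, the factorization can be checked with NumPy's built-in QR routine; the matrix below is an assumed example, not one from the text:

```python
import numpy as np

# An assumed 4 by 3 matrix with independent columns.
A = np.array([[1., 0., 2.],
              [1., 1., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])

Q, R = np.linalg.qr(A)   # reduced QR: Q is 4 by 3, R is 3 by 3 upper triangular

print(np.allclose(Q.T @ Q, np.eye(3)))   # Q has orthonormal columns
print(np.allclose(R, np.triu(R)))        # R is upper triangular
print(np.allclose(A, Q @ R))             # A = QR is recovered (to round-off)
```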
of the columns of Q = I: Standard basis e1 = (1, 0, 0, ..., 0), e2 = (0, 1, 0, ..., 0), ..., en = (0, 0, 0, ..., 1). That is not the only orthonormal basis! We can rotate the axes without changing the right angles at which they meet. These rotation matrices will
Rectangular Matrices with Orthogonal Columns

This chapter is about Ax = b, when A is not necessarily square. For Qx = b we now admit the same possibility: there may be more rows than columns. The n orthonormal

3.4 Orthogonal Bases and Gram-Schmidt
natural to think of the space R^∞. It contains all vectors v = (v1, v2, v3, ...) with an infinite sequence of components. This space is actually too big when there is no control on the size of the components v_j. A much be
real part 1 − θ^2/2 + ··· is cos θ. The imaginary part θ − θ^3/6 + ··· is the sine. The formula is correct, and I wish we had seen a more beautiful proof. With this formula, we can solve w^n = 1. It becomes e^{inθ} = 1, so that nθ must carry us around the unit circle and back to
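The solutions of w^n = 1 can be checked directly; a minimal sketch with an assumed n = 8:

```python
import cmath

n = 8
w = cmath.exp(2j * cmath.pi / n)   # w = cos(2*pi/n) + i*sin(2*pi/n)

# Every power of w stays on the unit circle ...
assert all(abs(abs(w**k) - 1) < 1e-12 for k in range(1, n + 1))
# ... and the n-th power carries us all the way around, back to 1.
assert abs(w**n - 1) < 1e-12
print("all", n, "powers lie on the unit circle; w**n == 1")
```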
and x̂2 = 1. The error in the equation 0x1 + 0x2 = 6 is sure to be 6.

Remark 4. Suppose b is actually in the column space of A: it is a combination b = Ax of the columns. Then the projection of b is still b:

    b in column space:  p = A(A^T A)^{-1} A^T Ax = Ax = b.
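Remark 4 is easy to confirm numerically; a small sketch with an assumed matrix A:

```python
import numpy as np

# An assumed 3 by 2 matrix with independent columns.
A = np.array([[1., 0.],
              [1., 1.],
              [0., 2.]])
P = A @ np.linalg.inv(A.T @ A) @ A.T   # projection onto the column space

x = np.array([3., -1.])
b = A @ x                      # b is in the column space by construction
print(np.allclose(P @ b, b))   # True: the projection of b is still b
```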
without using square roots. Notice the 1/2 from a^T b / a^T a instead of 1/√2 from q^T b:

    B = (1, 0, 0) − (1/2)(1, 0, 1)   and then   C = (2, 1, 0) − (1, 0, 1) − 2(1/2, 0, −1/2).

The Factorization A = QR

We started with a matrix A, whose columns were a, b, c. We ended with a matrix Q, whose
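The square-root-free computation above can be replayed numerically, reading the vectors a = (1,0,1), b = (1,0,0), c = (2,1,0) off the example:

```python
import numpy as np

a = np.array([1., 0., 1.])
b = np.array([1., 0., 0.])
c = np.array([2., 1., 0.])

A_ = a.copy()                          # A' = a
B_ = b - (a @ b) / (a @ a) * a         # B = (1,0,0) - (1/2)(1,0,1)
C_ = (c - (a @ c) / (a @ a) * a
        - (B_ @ c) / (B_ @ B_) * B_)   # C = (2,1,0) - (1,0,1) - 2(1/2,0,-1/2)

# The three directions are mutually orthogonal -- no square roots needed.
print(np.isclose(A_ @ B_, 0), np.isclose(A_ @ C_, 0), np.isclose(B_ @ C_, 0))
```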
Then w^8 = i^4 = 1. There has to be a system here. The complex numbers cos θ + i sin θ in the Fourier matrix are extremely special. The real part is plotted on the x-axis and the imaginary part on the y-axis (Figure 3.11). Then the number w lies on the unit cir
a, with cos θ = Op/Ob = a^T b / (||a|| ||b||). the coefficient x̂. All we need is the geometrical fact that the line from b to the closest point p = x̂a is perpendicular to the vector a:

    (b − x̂a) ⊥ a,  or  a^T (b − x̂a) = 0,  or  x̂ = a^T b / a^T a.   (4)

That gives the formula for the
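Formula (4) in code; the vectors a and b below are assumed examples:

```python
import numpy as np

a = np.array([1., 1., 1.])
b = np.array([2., 3., 4.])

x_hat = (a @ b) / (a @ a)   # formula (4)
p = x_hat * a               # closest point on the line through a
e = b - p                   # the error, from b to the line

print(np.isclose(a @ e, 0.0))   # True: the error is perpendicular to a
```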
(the dotted line in Figure 3.5) is perpendicular to a. This
fact will allow us to find the projection p. Even though a
and b are not orthogonal, the distance problem
automatically brings in orthogonality. The situation is the
same when we are given a plan
columns. In fact, b is Ax_r, with x_r in the row space, since the nullspace component gives Ax_n = 0. If another vector x_r′ in the row space gives Ax_r′ = b, then A(x_r − x_r′) = b − b = 0. This puts x_r − x_r′ in both the nullspace and the row space, which makes it
makes V orthogonal to Z. 20. Let S be a subspace of R^n. Explain what (S⊥)⊥ = S means and why it is true. 21. Let P be the plane in R^3 with equation x + 2y − z = 0. Find a vector perpendicular to P. What matrix has the plane P as its nullspace, and what matr
onto the column space. The error vector e = b − Ax̂ must be perpendicular to that space (Figure 3.8). Finding x̂ and the projection p = Ax̂ is so fundamental that we do it in two ways: 1. All vectors perpendicular to the column space lie in the left nullspace.
matrix aa^T/a^T a is the same if a is doubled:

    a = (2, 2, 2) gives P = (1/12) (2, 2, 2)^T (2, 2, 2) = the matrix with every entry 1/3, as before.

The line through a is the same, and that's all the projection matrix cares about. If a has unit length, the denominator
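A short check that doubling a leaves aa^T/a^T a unchanged (a sketch using NumPy):

```python
import numpy as np

def projection_matrix(a):
    """P = a a^T / a^T a, the projection onto the line through a."""
    a = np.asarray(a, dtype=float).reshape(-1, 1)
    return (a @ a.T) / float(a.T @ a)

P1 = projection_matrix([1, 1, 1])
P2 = projection_matrix([2, 2, 2])   # same line, so same projection matrix

print(np.allclose(P1, P2))                     # True
print(np.allclose(P1, np.full((3, 3), 1/3)))   # every entry is 1/3
```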
and they are the components of the dashed vector in Figure 3.9b. This error vector is orthogonal to the first column (1,1,1), since 2/7 − 6/7 + 4/7 = 0. It is orthogonal to the second column (−1,1,2), because −2/7 − 6/7 + 8/7 = 0. It is orthogonal to the column
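The two orthogonality checks can be reproduced numerically; the measurements b = (1, 1, 3) at t = −1, 1, 2 are an assumption reconstructed from the fractions above, not stated in this excerpt:

```python
import numpy as np

A = np.array([[1., -1.],    # columns (1,1,1) and (-1,1,2)
              [1.,  1.],
              [1.,  2.]])
b = np.array([1., 1., 3.])  # assumed measurements

x_hat = np.linalg.solve(A.T @ A, A.T @ b)
e = b - A @ x_hat           # error vector (2/7, -6/7, 4/7)

print(np.allclose(A.T @ e, 0))   # True: e is orthogonal to both columns
```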
invertible. If it is 4 by 4, then its four columns are independent and its column space is all of R^4. What is the projection onto the whole space? It is the identity matrix:

    P = A(A^T A)^{-1} A^T = A A^{-1} (A^T)^{-1} A^T = I.   (5)

The identity matrix is symmetric, I
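Equation (5) checked on an assumed invertible matrix:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 3.]])   # invertible, so C(A) is all of R^2
P = A @ np.linalg.inv(A.T @ A) @ A.T

print(np.allclose(P, np.eye(2)))   # projection onto the whole space is I
```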
1 and x^2 are not perpendicular. (Their inner product is always positive, because it is the integral of x^2.) Therefore the closest parabola to f(x) is not the sum of its projections onto 1, x, and x^2. There will be a matrix like (A^T A)^{-1}, and this coupling is gi
P_C = A(A^T A)^{-1} A^T is the projection onto the column space of A, what is the projection P_R onto the row space? (It is not P_C^T!)

3.3 Projections and Least Squares

20. If P is the projection onto the column space of A, what is the projection onto the
does this leave? Verify that the rows automatically become orthonormal at the same time. 7. Show, by forming b^T b directly, that Pythagoras's law holds for any combination b = x1 q1 + ··· + xn qn of orthonormal vectors: ||b||^2 = x1^2 + ··· + xn^2. In matrix terms, b = Q
point in the plane onto the line x + 2y = 0. 13. Prove that the trace of P = aa^T/a^T a (which is the sum of its diagonal entries) always equals 1. 14. What matrix P projects every point in R^3 onto the line of intersection of the planes x + y + t = 0 and x − t = 0? 15
matrix. (b) If Q (3 by 2) has orthonormal columns, then ||Qx|| always equals ||x||. 32. (a) Find a basis for the subspace S in R^4 spanned by all solutions of x1 + x2 + x3 − x4 = 0. (b) Find a basis for the orthogonal complement S⊥. (c) Find b1 in S and b2 in S⊥ so
(a) Verify that the best line goes through the center point (t̄, b̄) = (2, 9). (b) Explain why C + D t̄ = b̄ comes from the first equation in A^T A x̂ = A^T b. 38. What happens to the weighted average x̂_W = (w1^2 b1 + w2^2 b2)/(w1^2 + w2^2) if the first weig
finding that line, and if you compare them the whole chapter might become clear!

1. Solve [1 x] [C; D] = x^5 by least squares. The equation A^T A x̂ = A^T b is

    [(1,1) (1,x); (x,1) (x,x)] [C; D] = [(1, x^5); (x, x^5)]   or   [1 1/2; 1/2 1/3] [C; D] = [1/6; 1/7].
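The same numbers can be produced and solved numerically; the grid-based error comparison at the end is an added sketch, not from the text:

```python
import numpy as np

ATA = np.array([[1.0, 1/2],   # [(1,1) (1,x); (x,1) (x,x)] on [0, 1]
                [1/2, 1/3]])
ATb = np.array([1/6, 1/7])    # [(1,x^5); (x,x^5)]

C, D = np.linalg.solve(ATA, ATb)

# Mean squared error of c + d*x against x^5 on a fine grid of [0, 1];
# the least-squares (C, D) should beat nearby lines.
x = np.linspace(0, 1, 10001)
mse = lambda c, d: np.mean((x**5 - c - d*x)**2)
print(mse(C, D) < mse(C + 0.01, D) and mse(C, D) < mse(C, D - 0.01))  # True
```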
dimensions can be too small. The line V spanned by (0,1,0) is orthogonal to the line W spanned by (0,0,1), but V is not W⊥. The orthogonal complement of W is a two-dimensional plane, and the line V is only part of W⊥. When the dimensions are right, orthogonal
from these vectors (mutually orthogonal unit vectors). 7. Find a vector x orthogonal to the row space of A, and a vector y orthogonal to the column space, and a vector z orthogonal to the nullspace:

    A = [1 2 1; 2 4 3; 3 6 4].

8. If V and W are orthogonal su
contain these eigenvectors x and z? Symmetric matrices have perpendicular eigenvectors (see Section 5.5). 32. (Recommended) Draw Figure 3.4 to show each subspace for A = [1 2; 3 6] and B = [1 0; 3 0].

Chapter 3 Orthogonality

33. Find the pieces x_r an
subspace S is contained in a subspace V, prove that S⊥ contains V⊥. Problems 45–50 are about perpendicular columns and rows.

3.2 Cosines and Projections onto Lines

45. Suppose an n by n matrix is invertible: A A^{-1} = I. Then the first column of A^{-1} is ort
new meaning when the length involves W. The weighted length of x equals the ordinary length of Wx. Perpendicularity no longer means y^T x = 0; in the new system the test is (Wy)^T (Wx) = 0. The matrix W^T W appears in the middle. In this new sense, the proj
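A sketch of the new perpendicularity test, with an assumed invertible weighting matrix W:

```python
import numpy as np

W = np.array([[2., 0.],
              [1., 1.]])        # assumed invertible weighting matrix
x = np.array([1., 0.])

# (Wy)^T (Wx) = y^T (W^T W) x, so y must be orthogonal to (W^T W) x.
v = (W.T @ W) @ x
y = np.array([-v[1], v[0]])     # rotate v by 90 degrees

print(np.isclose((W @ y) @ (W @ x), 0.0))   # True: perpendicular in the W-sense
print(y @ x)                                # -1.0: not perpendicular in the old sense
```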
a1^2 + ··· + am^2), that geometry did earlier:

3K The least-squares solution to a problem ax = b in one unknown is x̂ = a^T b / a^T a.

You see that we keep coming back to the geometrical interpretation of a least-squares problem