In conclusion: the kernel of $L$ consists of all symmetric matrices, and the image consists of all skew-symmetric matrices.

61. Note that the first three matrices of the given basis $\mathfrak{B}$ are symmetric, so that $L(A) = A - A^T = 0$, and the coordinate vector $[L(A)]_{\mathfrak{B}}$ is $0$ for all three of them. The last matrix of the basis is skew-symmetric, so that $L(A) = 2A$, and $[L(A)]_{\mathfrak{B}} = 2e_4$. Using Fact 4.3.2, we find that the $\mathfrak{B}$-matrix of $L$ is
$$\begin{bmatrix} 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 2 \end{bmatrix}.$$

63. By Exercise 2.4.62b, the given LDU factorization of $A$ is unique. By Fact 5.3.9a, $A = A^T = (LDU)^T = U^T D^T L^T = U^T D L^T$ (note that $D^T = D$ since $D$ is diagonal) is another way to write the LDU factorization of $A$ (since $U^T$ is lower triangular and $L^T$ is upper triangular). By the uniqueness of the LDU factorization, we have $U = L^T$ (and $L = U^T$), as claimed.

65. Write $10A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}$; it is required that $a$, $b$, $c$, and $d$ be integers. Now $A = \begin{bmatrix} a/10 & b/10 \\ c/10 & d/10 \end{bmatrix}$ must be an orthogonal matrix, implying that $(a/10)^2 + (c/10)^2 = 1$, or $a^2 + c^2 = 100$. Checking the squares of all integers from 1 to 9, we see that there are only two ways to write 100 as a sum of two positive perfect squares: $100 = 36 + 64 = 64 + 36$. Since $a$ and $c$ are required to be positive, we have either $a = 6$ and $c = 8$, or $a = 8$ and $c = 6$. In each case we have two options for the second column of $A$, namely, the two unit vectors perpendicular to the first column vector. Thus we end up with four solutions:
$$A = \begin{bmatrix} .6 & -.8 \\ .8 & .6 \end{bmatrix},\quad \begin{bmatrix} .6 & .8 \\ .8 & -.6 \end{bmatrix},\quad \begin{bmatrix} .8 & -.6 \\ .6 & .8 \end{bmatrix},\quad \text{or} \quad \begin{bmatrix} .8 & .6 \\ .6 & -.8 \end{bmatrix}.$$

67. a. We need to show that $A^TAc = A^Tx$, or, equivalently, that $A^T(x - Ac) = 0$. But $A^T(x - Ac) = A^T(x - c_1v_1 - \cdots - c_mv_m)$ is the vector whose $i$th component is $(v_i)^T(x - c_1v_1 - \cdots - c_mv_m) = v_i \cdot (x - c_1v_1 - \cdots - c_mv_m)$, which we know to be zero.

b. The system $A^TAc = A^Tx$ has a unique solution $c$ for a given $x$, since $c$ is the coordinate vector of $\operatorname{proj}_V x$ with respect to the basis $v_1, \ldots, v_m$. Thus the coefficient matrix $A^TA$ must be invertible, so that we can solve for $c$ and write $c = (A^TA)^{-1}A^Tx$. Then $\operatorname{proj}_V x = c_1v_1 + \cdots + c_mv_m = Ac = A(A^TA)^{-1}A^Tx$.

Section 5.4

1. A basis of $\ker(A^T)$ is $\begin{bmatrix} -3 \\ 2 \end{bmatrix}$. (See Figure 5.8.)

3. We will first show that the vectors $v_1, \ldots, v_p, w_1, \ldots, w_q$ span $\mathbb{R}^n$. Any vector $v$ in $\mathbb{R}^n$ can be written as $v = v^{\parallel} + v^{\perp}$, where $v^{\parallel}$ is in $V$ and $v^{\perp}$ is in $V^{\perp}$ (by definition of a projection, Fact 5.1.4). Now $v^{\parallel}$ is a linear combination of $v_1, \ldots, v_p$, and $v^{\perp}$ is a linear combination of $w_1, \ldots, w_q$, showing that the vectors $v_1, \ldots, v_p, w_1, \ldots, w_q$ span $\mathbb{R}^n$. Note that $p + q = n$, by Fact 5.1.8c; therefore, the vectors $v_1, \ldots, v_p, w_1, \ldots, w_q$ form a basis of $\mathbb{R}^n$, by Fact 3.3.4d.

5. $V = \ker(A)$, where $A = \begin{bmatrix} 1 & 1 & 1 & 1 \\ 1 & 2 & 5 & 4 \end{bmatrix}$. Then $V^{\perp} = (\ker A)^{\perp} = \operatorname{im}(A^T)$, by Exercise 4.
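As a quick numerical sanity check (a sketch only; the kernel basis below is computed here by hand and is not given in the text), one can verify that every vector in $\ker(A)$ is perpendicular to the columns of $A^T$, i.e. to the rows of $A$:

```python
# Check (ker A)-perp = im(A^T) for the matrix A of Problem 5:
# every kernel vector of A should be perpendicular to each row of A
# (the rows of A are the columns of A^T).
A = [[1, 1, 1, 1],
     [1, 2, 5, 4]]

# Solving A x = 0 by hand gives x1 = 3*x3 + 2*x4, x2 = -4*x3 - 3*x4,
# so a basis of ker(A) is:
kernel_basis = [[3, -4, 1, 0],
                [2, -3, 0, 1]]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

for row in A:
    for k in kernel_basis:
        assert dot(row, k) == 0
print("every kernel vector is perpendicular to im(A^T)")
```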
Figure 5.8: for Problem 5.4.1.

The two columns of $A^T$ form a basis of $V^{\perp}$:
$$\begin{bmatrix} 1 \\ 1 \\ 1 \\ 1 \end{bmatrix},\quad \begin{bmatrix} 1 \\ 2 \\ 5 \\ 4 \end{bmatrix}.$$

7. $\operatorname{im}(A)$ and $\ker(A)$ are orthogonal complements by Fact 5.4.1: $(\operatorname{im} A)^{\perp} = \ker(A^T) = \ker(A)$.

9. $x_0$ is the shortest of all the vectors in $S$. (See Figure 5.9.)

11. a. Note that $L^{+}(y) = A^T(AA^T)^{-1}y$; indeed, this vector is in $\operatorname{im}(A^T) = (\ker A)^{\perp}$, and it is a solution of $L(x) = Ax = y$.
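The formula $L^{+}(y) = A^T(AA^T)^{-1}y$ can be checked on a small example (the matrix below is chosen here for illustration and is not from the text). For a $1 \times 2$ matrix $A$ of full row rank, $AA^T$ is a positive scalar, so the formula is easy to evaluate by hand:

```python
# Minimal sketch of L+(y) = A^T (A A^T)^{-1} y for A = [1 2] (1 x 2).
# Here A A^T = 1*1 + 2*2 = 5, a scalar, so its inverse is just 1/5.
A = [[1.0, 2.0]]
y = 10.0

AAT = sum(A[0][j] * A[0][j] for j in range(2))   # A A^T = 5.0
x = [A[0][j] * (y / AAT) for j in range(2)]      # x = A^T (A A^T)^{-1} y

# x is a scalar multiple of the row of A, so x lies in im(A^T),
# and A x should recover y.
Ax = sum(A[0][j] * x[j] for j in range(2))
print(x)    # [2.0, 4.0]
print(Ax)   # 10.0, so A x = y as required
```

Since $x$ is a multiple of $(1, 2)^T$, it lies in $\operatorname{im}(A^T) = (\ker A)^{\perp}$, matching the claim in part a.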