Copyright © 2007 by Gilbert Strang

LINEAR ALGEBRA IN A NUTSHELL

One question always comes on the first day of class. Do I have to know linear algebra? My reply gets shorter every year: You soon will. This section brings together many important points in the theory. It serves as a quick primer, not an official part of the applied mathematics course (like Chapters 1 and 2).

This summary begins with two lists that use most of the key words of linear algebra. The first list applies to invertible matrices. That property is described in 14 different ways. The second list shows the contrast, when A is singular (not invertible). There are more ways to test invertibility of an n by n matrix than I expected.

    Nonsingular                                  Singular
    A is invertible                              A is not invertible
    The columns are independent                  The columns are dependent
    The rows are independent                     The rows are dependent
    The determinant is not zero                  The determinant is zero
    Ax = 0 has one solution x = 0                Ax = 0 has infinitely many solutions
    Ax = b has one solution x = A^(-1) b         Ax = b has no solution or infinitely many
    A has n (nonzero) pivots                     A has r < n pivots
    A has full rank r = n                        A has rank r < n
    The reduced row echelon form is R = I        R has at least one zero row
    The column space is all of R^n               The column space has dimension r < n
    The row space is all of R^n                  The row space has dimension r < n
    All eigenvalues are nonzero                  Zero is an eigenvalue of A
    A^T A is symmetric positive definite         A^T A is only semidefinite
    A has n (positive) singular values           A has r < n singular values

Now we take a deeper look at linear equations, without proving every statement we make. The goal is to discover what Ax = b really means. One reference is my textbook Introduction to Linear Algebra, published by Wellesley-Cambridge Press.
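The parallel tests in the two lists can be checked numerically. The following sketch (my own illustration, not from the text) uses NumPy on two hypothetical 2 by 2 matrices, one nonsingular and one singular, to confirm that the determinant, rank, and Ax = b tests agree:

```python
import numpy as np

# Hypothetical examples: A is nonsingular, S is singular
A = np.array([[1.0, 2.0],
              [3.0, 7.0]])   # det(A) = 1*7 - 2*3 = 1, nonzero
S = np.array([[1.0, 2.0],
              [3.0, 6.0]])   # second row = 3 * first row

# Nonsingular tests: nonzero determinant, full rank r = n = 2
assert abs(np.linalg.det(A)) > 1e-12
assert np.linalg.matrix_rank(A) == 2

# Singular tests: zero determinant, rank r = 1 < n
assert abs(np.linalg.det(S)) < 1e-12
assert np.linalg.matrix_rank(S) == 1

# For nonsingular A, Ax = b has the one solution x = A^(-1) b
b = np.array([1.0, 1.0])
x = np.linalg.solve(A, b)
assert np.allclose(A @ x, b)
```

In practice the rank test (via `np.linalg.matrix_rank`, which uses singular values) is more reliable than comparing a floating-point determinant to zero.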
That book has a much more careful development with many examples (you could look at the course page, with videos of the lectures, on ocw.mit.edu or web.mit.edu/18.06). The key is to think of every multiplication Ax, a matrix A times a vector x, as a combination of the columns of A:

    Matrix Multiplication by Columns

    [ 1  2 ] [ C ]       [ 1 ]       [ 2 ]
    [ 3  6 ] [ D ]  =  C [ 3 ]  +  D [ 6 ]  =  combination of columns.

Multiplying by rows, the first component C + 2D comes from 1 and 2 in the first row of A. But I strongly recommend thinking of Ax a column at a time. Notice how x = (1, 0) and x = (0, 1) will pick out single columns of A:

    [ 1  2 ] [ 1 ]                      [ 1  2 ] [ 0 ]
    [ 3  6 ] [ 0 ]  =  first column     [ 3  6 ] [ 1 ]  =  last column.

Suppose A is an m by n matrix. Then Ax = 0 has at least one solution, the all-zeros vector x = 0. There are certainly other solutions in case n > m (more unknowns than equations). Even if m = n, there might be nonzero solutions to Ax = 0; then A is square but not invertible. It is the number r of independent rows and columns that counts. That number r is the rank of A (r ≤ m and r ≤ n)....
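The column picture of Ax, and the rank count, can be sketched in a few lines of NumPy (an illustration of my own, using the same 2 by 2 matrix as the text and hypothetical coefficients C and D):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 6]])
C, D = 5, 4  # hypothetical coefficients, chosen arbitrarily

# Ax equals C times column 1 plus D times column 2
combo = C * A[:, 0] + D * A[:, 1]
assert np.array_equal(A @ np.array([C, D]), combo)

# Unit vectors pick out single columns of A
assert np.array_equal(A @ np.array([1, 0]), A[:, 0])  # first column
assert np.array_equal(A @ np.array([0, 1]), A[:, 1])  # last column

# Here column 2 = 2 * column 1, so the rank is r = 1 < n = 2:
# A is square but not invertible, and Ax = 0 has nonzero solutions
assert np.linalg.matrix_rank(A) == 1
assert np.array_equal(A @ np.array([2, -1]), np.array([0, 0]))
```

Note that this A is exactly the singular case from the lists above: independent columns would force r = 2.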