# Every Gauss–Jordan step is a multiplication on the left by an elementary matrix


Every Gauss–Jordan step is a multiplication on the left by an elementary matrix. We are allowing three types of elementary matrices:

1. $E_{ij}$, to subtract a multiple of row $j$ from row $i$;
2. $P_{ij}$, to exchange rows $i$ and $j$;
3. $D$ (or $D^{-1}$), to divide all rows by their pivots.

The Gauss–Jordan process is really a giant sequence of matrix multiplications:

$$(D^{-1} \cdots E \cdots P \cdots E)\,A = I. \qquad (6)$$

That matrix in parentheses, to the left of $A$, is evidently a left-inverse! It exists, and it equals the right-inverse by Note 2, so **every nonsingular matrix is invertible**.

The converse is also true: *if $A$ is invertible, it has $n$ pivots.* In an extreme case that is clear: $A$ cannot have a whole column of zeros, because the inverse could never multiply a column of zeros to produce a column of $I$. In a less extreme case, suppose elimination starts on an invertible matrix $A$ but breaks down at column 3:

$$\textbf{Breakdown: no pivot in column 3} \qquad A = \begin{bmatrix} d_1 & x & x & x \\ 0 & d_2 & x & x \\ 0 & 0 & 0 & x \\ 0 & 0 & 0 & x \end{bmatrix}.$$

This matrix cannot have an inverse, no matter what the $x$'s are. One proof is to use column operations (for the first time?) to make the whole third column zero. By subtracting multiples of column 2 and then of column 1, we reach a matrix that is certainly not invertible. Therefore the original $A$ was not invertible. Elimination gives a complete test: **an $n$ by $n$ matrix is invertible if and only if it has $n$ pivots.**

## The Transpose Matrix

We need one more matrix, and fortunately it is much simpler than the inverse. The transpose of $A$ is denoted by $A^T$. Its columns are taken directly from the rows of $A$; the $i$th row of $A$ becomes the $i$th column of $A^T$:

$$\textbf{Transpose} \qquad \text{If } A = \begin{bmatrix} 2 & 1 & 4 \\ 0 & 0 & 3 \end{bmatrix} \text{ then } A^T = \begin{bmatrix} 2 & 0 \\ 1 & 0 \\ 4 & 3 \end{bmatrix}.$$

At the same time the columns of $A$ become the rows of $A^T$. If $A$ is an $m$ by $n$ matrix, then $A^T$ is $n$ by $m$. The final effect is to flip the matrix across its main diagonal, and the entry in row $i$, column $j$ of $A^T$ comes from row $j$, column $i$ of $A$:

$$\textbf{Entries of } A^T \qquad (A^T)_{ij} = A_{ji}. \qquad (7)$$
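The process described above can be sketched in code. Here is a minimal pure-Python illustration (the function name `gauss_jordan_inverse` is my own, not from the text): it runs Gauss–Jordan elimination on the augmented matrix $[A \mid I]$, applying only the three elementary operations, and reports failure when fewer than $n$ pivots are found. Exact rational arithmetic via `fractions.Fraction` sidesteps floating-point pivot tolerances; this is a sketch, not a production solver.

```python
from fractions import Fraction

def gauss_jordan_inverse(A):
    """Invert a square matrix by Gauss-Jordan elimination on [A | I].

    Each step is one of the three elementary operations from the text:
    a row exchange (P), subtracting a multiple of one row from another (E),
    or dividing a row by its pivot (D).  Returns None when fewer than n
    pivots are found, i.e. when A is singular.
    """
    n = len(A)
    # Build the augmented matrix [A | I] with exact rational arithmetic.
    M = [[Fraction(A[i][j]) for j in range(n)]
         + [Fraction(1 if i == j else 0) for j in range(n)]
         for i in range(n)]
    for col in range(n):
        # P step: find a nonzero pivot in this column and swap it up.
        pivot_row = next((r for r in range(col, n) if M[r][col] != 0), None)
        if pivot_row is None:
            return None                      # no pivot: A is not invertible
        M[col], M[pivot_row] = M[pivot_row], M[col]
        # D step: divide the pivot row by its pivot.
        p = M[col][col]
        M[col] = [entry / p for entry in M[col]]
        # E steps: clear the rest of the column, above and below the pivot.
        for r in range(n):
            if r != col and M[r][col] != 0:
                factor = M[r][col]
                M[r] = [a - factor * b for a, b in zip(M[r], M[col])]
    # The right half of M now holds the accumulated product
    # (D^{-1} ... E ... P ... E), which by equation (6) is the inverse of A.
    return [row[n:] for row in M]
```

For example, `gauss_jordan_inverse([[2, 1], [5, 3]])` returns `[[3, -1], [-5, 2]]`, while the singular matrix `[[1, 2], [2, 4]]` breaks down at column 2 and returns `None`, matching the $n$-pivot test.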
*Strang-5060 book — April 25, 2005 — Chapter 1: Matrices and Gaussian Elimination, page 50*

The transpose of a lower triangular matrix is upper triangular. The transpose of $A^T$ brings us back to $A$. If we add two matrices and then transpose, the result is the same as first transposing and then adding: $(A+B)^T$ is the same as $A^T + B^T$. But what is the transpose of a product $AB$ or an inverse $A^{-1}$? Those are the essential formulas of this section:

**1M**
(i) The transpose of $AB$ is $(AB)^T = B^T A^T$.
(ii) The transpose of $A^{-1}$ is $(A^{-1})^T = (A^T)^{-1}$.

Notice how the formula for $(AB)^T$ resembles the one for $(AB)^{-1}$. In both cases we reverse the order, giving $B^T A^T$ and $B^{-1} A^{-1}$. The proof for the inverse was easy, but this one requires an unnatural patience with matrix multiplication. The first row of $(AB)^T$ is the first column of $AB$. So the columns of $A$ are weighted by the first column of $B$. This amounts to the rows of $A^T$ weighted by the first row of $B^T$. That is exactly the first row of $B^T A^T$. The other rows of $(AB)^T$ and $B^T A^T$ also agree.
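Formula 1M (i) is easy to check numerically. The short pure-Python sketch below (helper names `transpose` and `matmul` are my own) defines the transpose exactly as in equation (7), reuses the 2-by-3 matrix $A$ from the text, and verifies that transposing the product $AB$ reverses the order to $B^T A^T$; the matrix $B$ is an arbitrary example of mine.

```python
def transpose(A):
    """(A^T)_{ij} = A_{ji}: flip the matrix across its main diagonal."""
    return [list(row) for row in zip(*A)]

def matmul(A, B):
    """Plain matrix product: entry (i, j) is row i of A times column j of B."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# The 2-by-3 example from the text: rows of A become columns of A^T.
A = [[2, 1, 4], [0, 0, 3]]
assert transpose(A) == [[2, 0], [1, 0], [4, 3]]

# Reversed order under transposition: (AB)^T equals B^T A^T.
B = [[1, 0], [2, 1], [0, 3]]
assert transpose(matmul(A, B)) == matmul(transpose(B), transpose(A))
```

Note the shape bookkeeping: $A$ is 2 by 3 and $B$ is 3 by 2, so $AB$ is 2 by 2, and both sides of the identity are 2 by 2 as well.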