The QR Decomposition

We have seen our first decomposition of a matrix, A = LU (and its variants). It was valid for a square matrix and aided us in solving the linear system Ax = b. The QR decomposition is valid for rectangular matrices as well as square ones. We will see that this decomposition can be used to solve n × n linear systems, but it is also useful in solving overdetermined systems such as those arising in linear least squares. The decomposition is also used in a general algorithm for finding all eigenvalues and eigenvectors of a matrix.
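As a quick illustration of the decomposition on a rectangular matrix, the following sketch uses NumPy's built-in QR routine (the matrix here is an arbitrary example, not one from the notes):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])      # a 3 x 2 rectangular matrix

Q, R = np.linalg.qr(A)          # reduced QR: Q is 3 x 2, R is 2 x 2

print(np.allclose(Q @ R, A))            # A = QR holds
print(np.allclose(Q.T @ Q, np.eye(2)))  # columns of Q are orthonormal
```

Note that for a rectangular A the "reduced" form gives a Q with orthonormal columns (Q^T Q = I) rather than a full square orthogonal matrix.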
The QR Decomposition of a square matrix

Let A be an n × n matrix with linearly independent columns. Then A can be uniquely written as A = QR, where Q is orthogonal (unitary in general) and R is an upper triangular matrix with positive diagonal entries.

Outline of Proof. The n × n matrix A^T A is symmetric and positive definite, and thus it can be written uniquely as A^T A = L L^T, where L is lower triangular with positive diagonal entries. Show that Q = A (L^T)^{-1} is an orthogonal matrix. Then A = Q L^T, so set R = L^T and we are done, because L has positive diagonal entries. Uniqueness is demonstrated by assuming we have two such decompositions and deriving a contradiction.
So all we have to do to show existence is demonstrate that Q = A (L^T)^{-1} is an orthogonal matrix, which means we have to demonstrate that Q^T Q = I. Using the fact that A^T A = L L^T, we have

Q^T Q = (A (L^T)^{-1})^T (A (L^T)^{-1}) = L^{-1} A^T A (L^T)^{-1} = L^{-1} L L^T (L^T)^{-1} = I.

All that remains is to verify uniqueness, which relies on the fact that the Cholesky decomposition is unique once we choose the sign of the diagonal entries. We assume there are two such decompositions and derive a contradiction. Let A = Q_1 R_1 and A = Q_2 R_2, where Q_1^T Q_1 = I, Q_2^T Q_2 = I, and R_1 ≠ R_2. Now writing A^T A with each of these two decompositions gives

A^T A = (Q_1 R_1)^T (Q_1 R_1) = R_1^T Q_1^T Q_1 R_1 = R_1^T R_1

and

A^T A = (Q_2 R_2)^T (Q_2 R_2) = R_2^T Q_2^T Q_2 R_2 = R_2^T R_2.

Thus A^T A = R_1^T R_1 = R_2^T R_2.
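The construction in the existence argument can be checked numerically. The sketch below (on an arbitrary random matrix, not one from the notes) forms the Cholesky factor of A^T A, sets R = L^T, builds Q = A (L^T)^{-1}, and verifies that Q is orthogonal and that A = QR:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# Cholesky decomposition of A^T A: lower triangular L with A^T A = L L^T
L = np.linalg.cholesky(A.T @ A)
R = L.T                         # R = L^T is upper triangular
Q = A @ np.linalg.inv(R)        # Q = A (L^T)^{-1}

print(np.allclose(Q.T @ Q, np.eye(4)))  # Q^T Q = I, so Q is orthogonal
print(np.allclose(Q @ R, A))            # A = QR
```

This mirrors the proof exactly; as noted below, forming A^T A is not how QR is computed in practice, since it squares the condition number of the problem.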
But this says that there are two different L L^T decompositions of A^T A in which each factor has positive diagonal entries, which contradicts the uniqueness of the Cholesky decomposition. Thus we have a contradiction, and the decomposition is unique.

The proof of this theorem actually gives us a way to construct a QR decomposition of a matrix: we first form A^T A, compute its Cholesky decomposition to obtain R = L^T, and form Q = A R^{-1}. This can be done by hand, but it is NOT a good approach computationally.

The QR decomposition can be used to solve a linear system Ax = b. We have

Ax = b  ⟹  QRx = b  ⟹  Q^T Q R x = Q^T b  ⟹  Rx = Q^T b,

which is an upper triangular system. So once we have the factorization, we only have to do a matrix-vector multiplication and solve an upper triangular system; both operations are O(n^2).
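A minimal sketch of this solution procedure, with the O(n^2) back substitution written out explicitly (the matrix and right-hand side are arbitrary illustrative values; `back_substitution` is a helper name introduced here, not from the notes):

```python
import numpy as np

def back_substitution(R, y):
    """Solve R x = y for upper triangular R in O(n^2) operations."""
    n = len(y)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # subtract the already-computed entries, then divide by the pivot
        x[i] = (y[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

Q, R = np.linalg.qr(A)
y = Q.T @ b                   # matrix-vector product: O(n^2)
x = back_substitution(R, y)   # solve Rx = Q^T b: O(n^2)

print(np.allclose(A @ x, b))  # True: x solves the original system
```

The factorization itself costs O(n^3); the point is that once Q and R are in hand, each additional right-hand side costs only O(n^2).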