matrix. Let the thin QR factorization of [A, b] be

    [A  b] = [Q  q_{n+1}] [ R  z ]
                          [ 0  rho ],

where q_{n+1}^T Q = 0. Writing w = rho * q_{n+1} (w is just q_{n+1}, scaled by the residual norm rho), note that A = QR and b = Qz + w. Then the right-hand side for (LS), Q^T b, is

    Q^T b = Q^T (Qz + w) = z,

and we can solve (LS) using backward substitution on Rx = z.

Now, do you really think that this removes the kappa(A) factor which came from the orthogonality errors in Q? Why should it? I applaud you for your skepticism. The answer lies in the (substantial) differences in behavior between MGS and CGS. Explicitly computing Q^T b as z = Q^T * b is the CGS way, but MGS adapts to the errors made in each inner product, giving a z which has, to the extent that it can, accounted for any nonorthogonality in the columns of Q.
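The scheme above can be sketched in NumPy: run MGS on the columns of the augmented matrix [A, b], so that z is accumulated one inner product at a time against the working (already-deflated) b column rather than computed all at once as Q^T b. The function name `mgs_ls` and the variable names are illustrative, not from the original notes; this is a minimal sketch, not a production least-squares solver.

```python
import numpy as np

def mgs_ls(A, b):
    """Least squares via MGS QR of the augmented matrix [A, b] (a sketch).

    MGS is applied to the n+1 columns of [A, b]. After processing the first
    n columns we have A = QR; the inner products taken against the working
    b column give z, with b = Qz + w and w orthogonal (to working precision)
    to the columns of Q. The solution comes from back substitution on Rx = z.
    """
    m, n = A.shape
    V = np.column_stack([A, b]).astype(float)   # working copy of [A, b]
    Q = np.zeros((m, n))
    R = np.zeros((n, n + 1))                    # last column of R is z
    for j in range(n):
        R[j, j] = np.linalg.norm(V[:, j])
        Q[:, j] = V[:, j] / R[j, j]
        # MGS step: deflate every REMAINING column, including b's column,
        # so each inner product sees the errors made by earlier steps.
        for k in range(j + 1, n + 1):
            R[j, k] = Q[:, j] @ V[:, k]
            V[:, k] -= R[j, k] * Q[:, j]
    z = R[:, n]                                 # accumulated rhs, not Q.T @ b
    x = np.linalg.solve(R[:, :n], z)            # R[:, :n] is upper triangular
    return x
```

By contrast, the CGS-style computation would be `z = Q.T @ b` after the factorization is finished, which cannot compensate for any nonorthogonality already present in the columns of Q.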