matrix. Let the thin QR factorization of $[A,\; b]$ be
$$\begin{bmatrix} A & b \end{bmatrix} = \begin{bmatrix} Q & w \end{bmatrix} \begin{bmatrix} R & z \\ 0 & \beta \end{bmatrix},$$
where $w^T Q = 0$ ($w$ is just $q_{n+1}$). Note that $A = QR$ and $b = Qz + \beta w$. Then the right-hand side for (LS), $Q^T b$, is
$$Q^T b = Q^T(Qz + \beta w) = z,$$
and we can solve (LS) using backward substitution on $Rx = z$. Now, do you really think that this removes the $\kappa(A)$ factor which came from the orthogonality errors in $Q$? Why should it? I applaud you for your skepticism. The answer lies in the (substantial) differences in behavior between MGS and CGS. Explicitly forming $z = Q^T b$ as a matrix-vector product is the CGS way, but MGS adapts to the errors made in each inner product, giving a $z$ which has, to the extent that it can, "accounted for" any nonorthogonality in the columns of $Q$. ...
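The procedure above can be sketched in NumPy. This is a minimal illustration, not the notes' own code: the function names (`mgs_qr`, `lstsq_mgs_augmented`) and the random test problem are mine. Running MGS on the augmented matrix $[A,\; b]$ produces $R$, $z$, and $\beta$ in one pass, after which $Rx = z$ is solved by backward substitution.

```python
import numpy as np

def mgs_qr(M):
    """Thin QR of M via Modified Gram-Schmidt (column-oriented).

    Unlike CGS, each remaining column is orthogonalized against the
    just-updated q_k, so errors made in earlier inner products are
    partially accounted for in later ones.
    """
    M = np.asarray(M, dtype=float)
    m, n = M.shape
    Q = M.copy()
    R = np.zeros((n, n))
    for k in range(n):
        R[k, k] = np.linalg.norm(Q[:, k])
        Q[:, k] /= R[k, k]
        for j in range(k + 1, n):
            R[k, j] = Q[:, k] @ Q[:, j]      # inner product with updated q_k
            Q[:, j] -= R[k, j] * Q[:, k]
    return Q, R

def back_substitute(R, z):
    """Solve the upper-triangular system R x = z."""
    n = len(z)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (z[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

def lstsq_mgs_augmented(A, b):
    """Least squares via MGS on [A, b]:
    [A b] = [Q w] [[R, z], [0, beta]], then solve R x = z.
    beta is the residual norm ||b - A x||.
    """
    n = A.shape[1]
    _, R_aug = mgs_qr(np.column_stack([A, b]))  # R_aug = [[R, z], [0, beta]]
    x = back_substitute(R_aug[:n, :n], R_aug[:n, n])
    beta = abs(R_aug[n, n])
    return x, beta
```

Note that $z$ here is never formed as an explicit product $Q^T b$; it falls out of the MGS recurrence, which is exactly the point of the passage above.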
 Fall '10
 MARK