Lecture Note 9-1: Nov 7 & Nov 10, 2006
Dr. Jeff Chak-Fu WONG
Department of Mathematics
Chinese University of Hong Kong
jwong@math.cuhk.edu.hk
MAT 2310 Linear Algebra and Its Applications, Fall 2006
Produced by Jeff Chak-Fu WONG

APPLICATIONS OF REAL VECTOR SPACES

1. QR-Factorization
2. Least Squares

QR-FACTORIZATION

We discussed the LU-factorization of a matrix and showed how it leads to a very efficient method for solving a linear system. We now discuss another factorization of a matrix $A$, called the QR-factorization of $A$. This type of factorization is widely used in computer codes

1. to find the eigenvalues of a matrix (we shall discuss this topic soon),
2. to solve linear systems, and
3. to find least squares approximations (in this lecture).

Remark. The Gram-Schmidt process with subsequent normalization not only converts an arbitrary basis $\{u_1, u_2, \ldots, u_n\}$ into an orthonormal basis $\{w_1, w_2, \ldots, w_n\}$, but it does so in such a way that for $k \geq 2$ the following relationships hold:

- $\{w_1, w_2, \ldots, w_k\}$ is an orthonormal basis for the space spanned by $\{u_1, u_2, \ldots, u_k\}$.
- $w_k$ is orthogonal to $\{u_1, u_2, \ldots, u_{k-1}\}$.

Theorem 0.1. If $A$ is an $m \times n$ matrix with linearly independent columns, then $A$ can be factored as $A = QR$, where $Q$ is an $m \times n$ matrix whose columns form an orthonormal basis for the column space of $A$ and $R$ is an $n \times n$ nonsingular upper triangular matrix.

Proof. Let $u_1, u_2, \ldots, u_n$ denote the linearly independent columns of $A$, which form a basis for the column space of $A$. By using the Gram-Schmidt process (see Theorem 0.3 in Lecture Note 7-2), we can obtain an orthonormal basis $w_1, w_2, \ldots, w_n$ for the column space of $A$. Recall how this orthonormal basis was obtained.
We first constructed an orthogonal basis $v_1, v_2, \ldots, v_n$ as follows: $v_1 = u_1$, and then for $i = 2, 3, \ldots, n$ we have

$$v_i = u_i - \frac{u_i \cdot v_1}{v_1 \cdot v_1}\,v_1 - \frac{u_i \cdot v_2}{v_2 \cdot v_2}\,v_2 - \cdots - \frac{u_i \cdot v_{i-1}}{v_{i-1} \cdot v_{i-1}}\,v_{i-1}. \tag{1}$$

Finally, $w_i = \dfrac{1}{\|v_i\|}\,v_i$ for $i = 1, 2, 3, \ldots, n$. Now each of the vectors $u_i$ can be written as a linear combination of the $w$-vectors:

$$\begin{aligned}
u_1 &= (u_1 \cdot w_1)\,w_1 + (u_1 \cdot w_2)\,w_2 + \cdots + (u_1 \cdot w_n)\,w_n \\
u_2 &= (u_2 \cdot w_1)\,w_1 + (u_2 \cdot w_2)\,w_2 + \cdots + (u_2 \cdot w_n)\,w_n \\
&\;\;\vdots \\
u_n &= (u_n \cdot w_1)\,w_1 + (u_n \cdot w_2)\,w_2 + \cdots + (u_n \cdot w_n)\,w_n.
\end{aligned} \tag{2}$$

From Theorem 0.2 (cf. Lecture Note 7-2) we have $r_{ji} = u_i \cdot w_j$. Moreover, from Equation (1), we see that $u_i$ lies in $\mathrm{span}\{v_1, v_2, \ldots, v_i\} = \mathrm{span}\{w_1, w_2, \ldots, w_i\}$. Since $w_j$ is orthogonal to $\mathrm{span}\{w_1, w_2, \ldots, w_i\}$ for $j > i$, it is orthogonal to $u_i$.
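The construction in the proof translates directly into a numerical procedure. Below is a minimal sketch in Python (assuming NumPy is available; the function name `qr_gram_schmidt` is our own, not from the lecture): the columns of $Q$ are the normalized vectors $w_i$, and the entries $r_{ji} = u_i \cdot w_j$ fill the upper triangle of $R$, with zeros below the diagonal exactly as the orthogonality argument predicts.

```python
import numpy as np

def qr_gram_schmidt(A):
    """QR-factorization via the classical Gram-Schmidt process of
    Equations (1)-(2). Columns of Q are the orthonormal w_i; the
    upper-triangular R has entries r_ji = u_i . w_j for j <= i."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for i in range(n):
        v = A[:, i].copy()                 # start from u_i
        for j in range(i):
            # Equation (1): subtract the projection onto each earlier
            # direction; since w_j is already a unit vector, the
            # coefficient is simply r_ji = u_i . w_j.
            R[j, i] = Q[:, j] @ A[:, i]
            v -= R[j, i] * Q[:, j]
        R[i, i] = np.linalg.norm(v)        # nonzero: columns independent
        Q[:, i] = v / R[i, i]              # w_i = v_i / ||v_i||
    return Q, R

# Small illustrative matrix (our own choice) with independent columns.
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = qr_gram_schmidt(A)
assert np.allclose(Q @ R, A)              # A = QR
assert np.allclose(Q.T @ Q, np.eye(2))    # columns of Q orthonormal
assert np.allclose(R, np.triu(R))         # R upper triangular
```

In practice, library routines such as `numpy.linalg.qr` use Householder reflections rather than Gram-Schmidt, which is numerically more stable; the sketch above is meant only to mirror the proof.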