# Lecture Note 9-1: Nov 7 - Nov 10, 2006


Dr. Jeff Chak-Fu WONG
Department of Mathematics, Chinese University of Hong Kong
[email protected]
MAT 2310 Linear Algebra and Its Applications, Fall 2006

## Applications of Real Vector Spaces

1. QR-Factorization
2. Least Squares

## QR-Factorization

We discussed the LU-factorization of a matrix and showed how it leads to a very efficient method for solving a linear system. We now discuss another factorization of a matrix $A$, called the QR-factorization of $A$. This type of factorization is widely used in computer codes

1. to find the eigenvalues of a matrix (we shall discuss this topic soon),
2. to solve linear systems, and
3. to find least squares approximations (in this lecture).

**Remark.** The Gram-Schmidt process with subsequent normalization not only converts an arbitrary basis $\{u_1, u_2, \dots, u_n\}$ into an orthonormal basis $\{w_1, w_2, \dots, w_n\}$, but it does so in such a way that for $k \ge 2$ the following relationships hold:

- $\{w_1, w_2, \dots, w_k\}$ is an orthonormal basis for the space spanned by $\{u_1, u_2, \dots, u_k\}$.
- $w_k$ is orthogonal to $\mathrm{span}\{u_1, u_2, \dots, u_{k-1}\}$.

**Theorem 0.1.** If $A$ is an $m \times n$ matrix with linearly independent columns, then $A$ can be factored as $A = QR$, where $Q$ is an $m \times n$ matrix whose columns form an orthonormal basis for the column space of $A$ and $R$ is an $n \times n$ nonsingular upper triangular matrix.

*Proof.* Let $u_1, u_2, \dots, u_n$ denote the linearly independent columns of $A$, which form a basis for the column space of $A$. By using the Gram-Schmidt process (see Theorem 0.3 in Lecture Note 7-2), we can obtain an orthonormal basis $w_1, w_2, \dots, w_n$ for the column space of $A$. Recall how this orthonormal basis was obtained.
We first constructed an orthogonal basis $v_1, v_2, \dots, v_n$ as follows: $v_1 = u_1$, and then for $i = 2, 3, \dots, n$ we have

$$
v_i = u_i - \frac{u_i \cdot v_1}{v_1 \cdot v_1}\, v_1 - \frac{u_i \cdot v_2}{v_2 \cdot v_2}\, v_2 - \cdots - \frac{u_i \cdot v_{i-1}}{v_{i-1} \cdot v_{i-1}}\, v_{i-1}. \tag{1}
$$

Finally, $w_i = \dfrac{1}{\|v_i\|}\, v_i$ for $i = 1, 2, \dots, n$. Now each of the vectors $u_i$ can be written as a linear combination of the $w$-vectors:

$$
\begin{aligned}
u_1 &= (u_1 \cdot w_1) w_1 + (u_1 \cdot w_2) w_2 + \cdots + (u_1 \cdot w_n) w_n \\
u_2 &= (u_2 \cdot w_1) w_1 + (u_2 \cdot w_2) w_2 + \cdots + (u_2 \cdot w_n) w_n \\
&\;\;\vdots \\
u_n &= (u_n \cdot w_1) w_1 + (u_n \cdot w_2) w_2 + \cdots + (u_n \cdot w_n) w_n.
\end{aligned} \tag{2}
$$

From Theorem 0.2 (cf. Lecture Note 7-2) we have $r_{ji} = u_i \cdot w_j$. Moreover, from Equation (1), we see that $u_i$ lies in $\mathrm{span}\{v_1, v_2, \dots, v_i\} = \mathrm{span}\{w_1, w_2, \dots, w_i\}$. Since $w_j$ is orthogonal to $\mathrm{span}\{w_1, w_2, \dots, w_i\}$ for $j > i$, it is orthogonal to $u_i$. ...
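The construction in the proof translates directly into a computation: build $Q$ column by column via Gram-Schmidt, recording $r_{ji} = u_i \cdot w_j$ as the entries of $R$. Below is a minimal numerical sketch of this idea; the function name `gram_schmidt_qr` and the example matrix are illustrative choices, not part of the lecture.

```python
import numpy as np

def gram_schmidt_qr(A):
    """Factor A = QR by the Gram-Schmidt process, as in the proof of
    Theorem 0.1: Q has orthonormal columns w_i, R is upper triangular
    with entries r_ji = u_i . w_j."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for i in range(n):
        v = A[:, i].copy()                # v_i starts as u_i
        for j in range(i):                # subtract projections onto w_1..w_{i-1}
            R[j, i] = Q[:, j] @ A[:, i]   # r_ji = u_i . w_j
            v -= R[j, i] * Q[:, j]
        R[i, i] = np.linalg.norm(v)       # ||v_i||, nonzero since columns are independent
        Q[:, i] = v / R[i, i]             # w_i = v_i / ||v_i||
    return Q, R

# An arbitrary 3x3 matrix with linearly independent columns (illustrative).
A = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])
Q, R = gram_schmidt_qr(A)
assert np.allclose(Q @ R, A)              # A = QR
assert np.allclose(Q.T @ Q, np.eye(3))    # columns of Q are orthonormal
assert np.allclose(R, np.triu(R))         # R is upper triangular
```

Note that production codes compute QR with Householder reflections rather than classical Gram-Schmidt (which loses orthogonality in floating point for ill-conditioned $A$); the sketch above mirrors the proof, not the recommended numerical algorithm.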


