Lecture Note 18

Dr. Jeff Chak-Fu WONG
Department of Mathematics
Chinese University of Hong Kong
jwong@math.cuhk.edu.hk

MAT 2310 Linear Algebra and Its Applications
Fall, 2007

Produced by Jeff Chak-Fu WONG

APPLICATIONS OF REAL VECTOR SPACES

1. Least Squares
2. QR Factorization

LEAST SQUARES

Prerequisites:
1. Solutions of Linear Systems of Equations
2. The Inverse of a Matrix
3. n-Vectors
4. Orthogonal Complements

We recall that:

- An m × n linear system Ax = b is inconsistent if it has no solution.
- Ax = b is consistent if and only if b belongs to the column space of A (cf. Lecture Note 11, Theorem 0.6, Page 65). Equivalently, Ax = b is inconsistent if and only if b is not in the column space of A.

Inconsistent systems do indeed arise in many situations, and we must determine how to deal with them.

Our approach is to change the problem so that we do not require that the matrix equation Ax = b be satisfied. Instead, we seek a vector x̂ in R^n such that Ax̂ is as close to b as possible. If W is the column space of A, it follows that the vector in W that is closest to b is proj_W b. That is, ‖b − w‖, for w in W, is minimized when w = proj_W b. Thus, if we find x̂ such that Ax̂ = proj_W b, then we are assured that ‖b − Ax̂‖ will be as small as possible (see Figure 1).

Figure 1: A solution x̂ produces the vector Ax̂ = w = proj_W b in W (the column space of A) that is closest to b.

As shown in the proof of Theorem 0.5 (cf. Lecture Note 17, Page 49), b − proj_W b = b − Ax̂ is orthogonal to every vector in W (see Figure 2). In other words, b − Ax̂ is orthogonal to W. But W is the column space of A, so it follows from Theorem 0.4 (Lecture Note 17) that b − Ax̂ lies in the null space of A^T.
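The projection argument above can be checked numerically. The following is a minimal NumPy sketch; the 3 × 2 matrix A and the vector b (chosen so that b is not in the column space of A) are illustrative assumptions, not taken from the notes:

```python
import numpy as np

# Made-up inconsistent system: b is not in the column space of A.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 0.0])

# proj_W b = A (A^T A)^{-1} A^T b, valid when A has full column rank.
proj_b = A @ np.linalg.solve(A.T @ A, A.T @ b)

# The difference b - proj_W b is orthogonal to W, i.e. to every column
# of A, so A^T (b - proj_b) should be the zero vector (up to rounding).
residual = b - proj_b
print(A.T @ residual)  # approximately [0. 0.]
```

Here proj_b works out to (1/3, 1/3, 2/3), and the residual (2/3, 2/3, −2/3) is indeed orthogonal to both columns of A.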
Figure 2: The residual b − Ax̂ = b − proj_W b is orthogonal to W, the column space of A.

It then follows that b − Ax̂ is orthogonal to each column of A. In terms of a matrix equation, we have

    A^T (Ax̂ − b) = 0,

or equivalently,

    A^T A x̂ = A^T b.

Thus, x̂ is a solution to

    A^T A x = A^T b.                    (1)

Any solution to (1) is called a least squares solution to the linear system Ax = b. (Warning: in general, Ax̂ ≠ b.) Equation (1) is called the normal system of equations associated with Ax = b, or just the normal system.

Note that x̂ is a least squares solution of Ax = b if and only if b − Ax̂ belongs to W⊥. Indeed, b = Ax̂ + (b − Ax̂), where Ax̂ belongs to W.

Observe that if Ax = b is consistent, then a solution to this system is a least squares solution. In particular, if A is nonsingular, a least squares solution to Ax = b is just the usual solution x = A⁻¹b.
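The normal system (1) can be formed and solved directly. As a sketch (the matrices and right-hand sides below are invented for the example), this agrees with NumPy's built-in least squares routine, and for a nonsingular square matrix it reduces to the usual solution A⁻¹b:

```python
import numpy as np

# Made-up inconsistent 3x2 system Ax = b.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 0.0])

# Solve the normal system A^T A x = A^T b for a least squares solution.
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# np.linalg.lstsq minimizes ||b - Ax|| directly and gives the same answer.
x_lstsq = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.allclose(x_hat, x_lstsq))   # True

# Warning from the notes: in general A x_hat != b.
print(np.allclose(A @ x_hat, b))     # False

# If A is square and nonsingular, the least squares solution
# coincides with the usual solution x = A^{-1} b.
A2 = np.array([[2.0, 1.0],
               [1.0, 3.0]])
b2 = np.array([3.0, 5.0])
x2_ls = np.linalg.solve(A2.T @ A2, A2.T @ b2)
print(np.allclose(x2_ls, np.linalg.solve(A2, b2)))  # True
```

Solving the normal system via A^T A is the textbook route shown here; in practice a QR factorization (the next topic in these notes) is preferred numerically, since forming A^T A squares the condition number.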