EE103 Fall 2007 Lecture Notes (SEJ), Section 7

SECTION 7: INTRODUCTION TO LEAST SQUARES APPROXIMATION (with Application to Polynomial Approximation)

Contents:
    Least Squares
    Example 1 (Polynomial linear least squares approximation)
    Example 2 (Multiple linear regression)
    Example 3 (Numerical example of polynomial approximation)
    Geometric Interpretation and Orthogonality
    The Gram-Schmidt Process and QR Factorization

SECTION 7: INTRODUCTION TO LEAST SQUARES APPROXIMATION (with Application to Polynomial Approximation)

Consider the system of linear equations $Ax = b$, where $A$ is $m \times n$, $m > n$, and the columns of $A$, namely $a_1, a_2, \ldots, a_n$, are linearly independent. Such a system of equations usually has no solution. We define, for a given $x \in \mathbb{R}^n$, the error vector

    $e(x) = Ax - b$

The minimum error problem is the problem of choosing an $x \in \mathbb{R}^n$ that minimizes the norm of the error, $\|e(x)\|$. The minimum error problem is, therefore,

    $\min_{x \in \mathbb{R}^n} \|e(x)\|$

Least Squares

Least-squares problems are those for which the norm is the $\ell_2$ or Euclidean norm, $\|z\|_2 = \sqrt{z^T z}$. In this case we may write

    $\min_{x \in \mathbb{R}^n} \|e(x)\|_2 \;\Leftrightarrow\; \min_{x \in \mathbb{R}^n} \|e(x)\|_2^2 = \min_{x \in \mathbb{R}^n} e(x)^T e(x) = \min_{x \in \mathbb{R}^n} \sum_{i=1}^{m} e_i(x)^2$

That is, if the Euclidean norm is used, we choose an $x$ that minimizes the sum of the squared errors; hence the term "least squares".

Let $f(x) = e(x)^T e(x)$; we wish to minimize $f(x)$ and, of course, if $x$ is a minimizer, then $x$ must satisfy the vector equation

    $\nabla f(x) = 0$

When this vector equation is linear in the unknowns $x$, we have a so-called linear least squares problem. For the remainder of this section, we will focus on the linear least squares problem.
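
As a concrete illustration of the objective being minimized, here is a minimal NumPy sketch; the 3-by-2 system below is made up purely for illustration. It evaluates $\|e(x)\|_2^2$ at two candidate points, one arbitrary and one equal to the least-squares minimizer for this particular $A$ and $b$.

```python
import numpy as np

# A made-up overdetermined system: m = 3 equations, n = 2 unknowns.
# In general no x satisfies all three equations exactly.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 0.0])

def squared_error(x):
    e = A @ x - b          # error vector e(x) = Ax - b
    return e @ e           # ||e(x)||_2^2, the sum of squared errors

print(squared_error(np.array([1.0, 1.0])))           # 4.0: fits the first two rows, misses the third badly
print(squared_error(np.array([1.0/3.0, 1.0/3.0])))   # 4/3: the least-squares minimizer for this A and b
```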

Now,

    $f(x) = e(x)^T e(x) = (Ax - b)^T (Ax - b) = x^T A^T A x - 2 b^T A x + b^T b$

    $\Rightarrow \nabla f(x) = 2 x^T A^T A - 2 b^T A$

Therefore,

    $\nabla f(x) = 0 \;\Rightarrow\; x^T A^T A = b^T A$

or, by taking the transpose of this latter equation,

    $A^T A x = A^T b$

These equations are called the normal equations, and we'll soon learn of the meaning of that term. $A^T A$ is $n \times n$, nonsingular, and positive definite. Therefore, mathematically,

    $x = (A^T A)^{-1} A^T b$

(this is a mathematical expression and is not to be used for computation). Since the columns of $A$ are linearly independent, the matrix $A^T A$ is positive definite (PD), and therefore Choleski's method may be employed to factor $A^T A$ as

    $A^T A = L L^T$

where $L$ is a lower triangular matrix. Given this Choleski factorization, the equation $L L^T x = A^T b$ may be solved simply by forward and back substitution:

    $L y = A^T b$ is solved for $y$ by forward substitution,
    $L^T x = y$ is solved for $x$ by back substitution.
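
The forward/back substitution recipe above translates directly into a short NumPy/SciPy sketch. This is an illustrative implementation under the stated assumptions (the function name lls_via_choleski and the small test system are made up for this sketch), not code from the course:

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

def lls_via_choleski(A, b):
    """Solve min ||Ax - b||_2 by forming the normal equations
    A^T A x = A^T b and factoring A^T A = L L^T (Choleski)."""
    G = A.T @ A                                    # n x n, positive definite if A has full column rank
    L = cholesky(G, lower=True)                    # G = L @ L.T, L lower triangular
    y = solve_triangular(L, A.T @ b, lower=True)   # forward substitution: L y = A^T b
    x = solve_triangular(L.T, y, lower=False)      # back substitution:    L^T x = y
    return x

# Small made-up test: m = 4 equations, n = 2 unknowns.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([1.1, 1.9, 3.2, 3.9])
print(lls_via_choleski(A, b))    # should agree with np.linalg.lstsq(A, b, rcond=None)[0]
```

In practice one often avoids forming $A^T A$ explicitly, since doing so squares the condition number of the problem; the QR factorization introduced later in this section is preferred for that reason.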

Example 1 (Polynomial linear least squares approximation): We wish to approximate the $n$ data points $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$ with an $m$th degree polynomial

    $P_m(x) = a_m x^m + a_{m-1} x^{m-1} + \cdots + a_1 x + a_0$

where $m \le n - 1$. That is, we would like to find coefficients $a_m, \ldots, a_1, a_0$ so that, if possible,

    $a_m x_1^m + a_{m-1} x_1^{m-1} + \cdots + a_1 x_1 + a_0 \overset{?}{=} y_1$
    $a_m x_2^m + a_{m-1} x_2^{m-1} + \cdots + a_1 x_2 + a_0 \overset{?}{=} y_2$
    $\vdots$
    $a_m x_n^m + a_{m-1} x_n^{m-1} + \cdots + a_1 x_n + a_0 \overset{?}{=} y_n$
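
To make the setup concrete, here is a small NumPy sketch with made-up data points (the values of xs and ys are purely illustrative). It builds the coefficient matrix whose rows are $(x_i^m, x_i^{m-1}, \ldots, x_i, 1)$ and solves the resulting normal equations; the Choleski route sketched earlier would work identically, since the matrix has full column rank for distinct $x_i$.

```python
import numpy as np

# Made-up data: n = 6 points, approximated by an m = 2 (quadratic) polynomial.
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
ys = np.array([1.0, 2.7, 5.8, 11.1, 17.9, 26.2])
m = 2

# Row i of V is (x_i^m, x_i^(m-1), ..., x_i, 1), so V @ a stacks the values P_m(x_i).
V = np.vander(xs, m + 1)                 # n x (m+1) Vandermonde-type matrix

# Normal equations V^T V a = V^T y, solved here with a generic solver.
a = np.linalg.solve(V.T @ V, V.T @ ys)

print(a)             # coefficients [a_2, a_1, a_0]
print(V @ a - ys)    # residual vector e(a) at the data points
```

For higher-degree fits this Vandermonde-type matrix becomes badly conditioned, which is one motivation for the orthogonality and QR factorization material later in this section.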