EE103 Section 6

SECTION 6: INTRODUCTION TO LEAST SQUARES APPROXIMATION
(with Application to Polynomial Approximation)

Contents:
    Linear Least Squares (LLS)
    Example 1 (Polynomial linear least squares approximation)
    Example 2 (Multiple linear regression)
    Example 3 (Numerical example of polynomial approximation)
    Geometric Interpretation and Orthogonality
    The Gram-Schmidt Process and QR Factorization
    Examples of Classic and Modified Gram-Schmidt, and Choleski Implementations
SECTION 6: INTRODUCTION TO LEAST SQUARES APPROXIMATION (with Application to Polynomial Approximation)

Consider the system of linear equations

    Ax = b

where A is m x n, m > n, and the columns of A, a_1, a_2, ..., a_n, are linearly independent (i.e., rank A = n). Such a system of equations usually has no solution. We define, for a given x ∈ R^n, the error vector

    e(x) = Ax - b.

The minimum norm problem is the problem of choosing an x ∈ R^n that minimizes the norm of the error, ||e(x)||. For the purposes of these notes, the only norms that are of concern are the following:

    1.  ||z||_1 = Σ_{i=1}^{n} |z_i|
    2.  ||z||_2 = ( Σ_{i=1}^{n} z_i^2 )^{1/2} = (z^T z)^{1/2}
    3.  ||z||_∞ = max_{1 ≤ i ≤ n} |z_i|

The minimum error problem is, therefore,

    min_{x ∈ R^n} ||e(x)||.

Linear Least Squares (LLS)

Least-squares problems are those for which the norm is the Euclidean norm, ||z||_2^2 = z^T z. In this case we may write

    min_{x ∈ R^n} ||e(x)||_2  ⇔  min_{x ∈ R^n} e(x)^T e(x) = min_{x ∈ R^n} Σ_{i=1}^{m} e_i(x)^2.

That is, if the Euclidean norm is used, we choose an x that minimizes the sum of the squared errors; hence the term "least squares". Let f(x) = ||e(x)||_2^2 = e(x)^T e(x); we wish to minimize f(x) and, of course, if x is a minimizer, then x must satisfy the vector equation

    ∇f(x) = 0.
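As a small numerical illustration of these definitions (this sketch is not part of the original notes; it assumes NumPy, and the matrix A, vector b, and candidate x are made-up data), the error vector e(x) = Ax - b, its three norms, and the least-squares objective f(x) can be evaluated as follows:

```python
import numpy as np

# Made-up overdetermined system: A is m x n with m > n and full column rank.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])          # m = 4, n = 2
b = np.array([1.1, 1.9, 3.2, 3.9])
x = np.array([0.0, 1.0])            # an arbitrary candidate x in R^n

e = A @ x - b                       # error vector e(x) = Ax - b

one_norm = np.sum(np.abs(e))        # ||e||_1
two_norm = np.sqrt(e @ e)           # ||e||_2 = (e^T e)^(1/2)
inf_norm = np.max(np.abs(e))        # ||e||_inf
f = e @ e                           # least-squares objective f(x) = ||e(x)||_2^2

print(one_norm, two_norm, inf_norm, f)
```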
When this vector equation is linear in the unknowns x, we have a so-called linear least squares problem. For the remainder of this section, we will focus on the linear least squares problem. Now,

    f(x) = e(x)^T e(x) = (Ax - b)^T (Ax - b) = x^T A^T A x - 2 b^T A x + b^T b
    ⇒  ∇f(x)^T = 2 x^T A^T A - 2 b^T A.

Therefore,

    ∇f(x) = 0  ⇒  x^T A^T A = b^T A

or, by taking the transpose of this latter equation,

    A^T A x = A^T b.

These equations are called the normal equations, and we'll soon learn of the meaning of that term. A^T A is n x n, nonsingular, and positive definite. Therefore, mathematically,

    x = (A^T A)^{-1} A^T b

(this is a mathematical expression and is not to be used for computation). Since the columns of A are linearly independent, the matrix A^T A is PD (positive definite) and therefore Choleski's method may be employed to factor A^T A as

    A^T A = L L^T

where L is a lower triangular matrix. Given this Choleski factorization, the equation L L^T x = A^T b may be simply solved by forward and back substitution:

    L y = A^T b    (solve for y by forward substitution),
    L^T x = y      (solve for x by back substitution).

Example 1 (Polynomial linear least squares approximation): We wish to approximate the n data points (x_1, y_1), (x_2, y_2), ..., (x_n, y_n) with an m-th degree polynomial

    P_m(x) = a_m x^m + a_{m-1} x^{m-1} + ... + a_1 x + a_0

where m ≤ n - 1.
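A minimal sketch of the computation just described (not from the notes; NumPy and SciPy are assumed, and the data are made up) forms the normal equations and solves them through the Choleski factorization:

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular

def lls_via_choleski(A, b):
    """Solve min_x ||Ax - b||_2 by forming the normal equations
    A^T A x = A^T b and factoring A^T A = L L^T (Choleski)."""
    G = A.T @ A                                   # n x n, positive definite when rank A = n
    rhs = A.T @ b
    L = cholesky(G, lower=True)                   # G = L L^T with L lower triangular
    y = solve_triangular(L, rhs, lower=True)      # forward substitution: L y = A^T b
    x = solve_triangular(L.T, y, lower=False)     # back substitution:    L^T x = y
    return x

# Made-up overdetermined example (m = 4 > n = 2):
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([1.1, 1.9, 3.2, 3.9])
print(lls_via_choleski(A, b))    # agrees with np.linalg.lstsq(A, b, rcond=None)[0]
```

Note that forming A^T A squares the condition number of the problem, which is one reason the QR factorization discussed later in this section is often preferred in practice.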
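To make Example 1 concrete, here is a short hedged sketch (again not part of the notes; NumPy is assumed and the data are hypothetical) showing how the polynomial fit reduces to the linear least squares problem above: row i of the coefficient matrix contains the powers 1, x_i, ..., x_i^m, and the unknowns are the polynomial coefficients.

```python
import numpy as np

def polynomial_lls(xs, ys, m):
    """Least-squares fit of an m-th degree polynomial
    P_m(x) = a_0 + a_1 x + ... + a_m x^m to (x_1, y_1), ..., (x_n, y_n).
    Requires m <= n - 1 (and distinct x_i) so the columns of A are
    linearly independent."""
    xs = np.asarray(xs, dtype=float)
    ys = np.asarray(ys, dtype=float)
    # Vandermonde-type matrix: row i is [1, x_i, x_i^2, ..., x_i^m].
    A = np.vander(xs, N=m + 1, increasing=True)
    # Any linear least squares solver applies here; the Choleski routine
    # sketched above or np.linalg.lstsq both solve min ||A a - y||_2.
    coeffs, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coeffs            # coeffs[k] is the coefficient a_k

# Hypothetical data: n = 5 points, quadratic fit (m = 2 <= n - 1).
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 2.7, 5.8, 11.1, 17.9]
print(polynomial_lls(xs, ys, 2))
```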
