Scientific Computing: An Introductory Survey
Chapter 3: Linear Least Squares

Prof. Michael T. Heath
Department of Computer Science
University of Illinois at Urbana-Champaign

Copyright (c) 2002. Reproduction permitted for noncommercial, educational use only.

Outline
1 Least Squares Data Fitting
2 Existence, Uniqueness, and Conditioning
3 Solving Linear Least Squares Problems

Method of Least Squares
- Measurement errors are inevitable in observational and experimental sciences
- Errors can be smoothed out by averaging over many cases, i.e., taking more measurements than are strictly necessary to determine parameters of system
- Resulting system is overdetermined, so usually there is no exact solution
- In effect, higher dimensional data are projected into lower dimensional space to suppress irrelevant detail
- Such projection is most conveniently accomplished by method of least squares

Linear Least Squares
- For linear problems, we obtain overdetermined linear system Ax = b, with m x n matrix A, m > n
- System is better written Ax ≈ b, since equality is usually not exactly satisfiable when m > n
- Least squares solution x minimizes squared Euclidean norm of residual vector r = b - Ax:

    min_x ||r||_2^2 = min_x ||b - Ax||_2^2
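The minimization above can be sketched with NumPy's np.linalg.lstsq, which computes the least squares solution of an overdetermined system directly. This is a minimal illustration, not part of the slides; the matrix A and vector b below are made-up example data.

```python
import numpy as np

# Overdetermined system Ax ≈ b: m = 4 equations, n = 2 unknowns (m > n),
# so equality generally cannot be satisfied exactly.
# Hypothetical data, chosen for illustration only.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
b = np.array([1.1, 1.9, 3.2, 3.9])

# Least squares solution x minimizes the squared Euclidean norm
# of the residual vector r = b - Ax.
x, _, _, _ = np.linalg.lstsq(A, b, rcond=None)

r = b - A @ x
print(x)             # parameters minimizing ||b - Ax||_2^2, here [0.1, 0.97]
print(np.sum(r**2))  # minimized squared residual norm, ≈ 0.063 (nonzero: no exact solution)
```

Note that the residual is nonzero, as expected for an overdetermined system: the least squares solution is the best achievable fit, not an exact one.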
Data Fitting
- Given m data points (t_i, y_i), find n-vector x of parameters that gives best fit to model function f(t, x):

    min_x sum_{i=1..m} (y_i - f(t_i, x))^2

- Problem is linear if function f is linear in components of x:

    f(t, x) = x_1 φ_1(t) + x_2 φ_2(t) + ... + x_n φ_n(t)

  where functions φ_j depend only on t
- Problem can be written in matrix form as Ax ≈ b, with a_ij = φ_j(t_i) and b_i = y_i

Polynomial Fitting
- Polynomial fitting

    f(t, x) = x_1 + x_2 t + x_3 t^2 + ... + x_n t^(n-1)

  is linear, since polynomial is linear in coefficients, though nonlinear in independent variable t
- Fitting sum of exponentials

    f(t, x) = x_1 e^(x_2 t) + ... + x_(n-1) e^(x_n t)

  is example of nonlinear problem
- For now, we will consider only linear least squares problems
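The polynomial fitting case maps directly onto the matrix form Ax ≈ b: with basis functions φ_j(t) = t^(j-1), the matrix A is a Vandermonde matrix. A minimal NumPy sketch, not from the slides; the data are synthetic, generated from an assumed quadratic so the fit should recover its coefficients.

```python
import numpy as np

# Synthetic data from y = 2 - t + 0.5 t^2 (assumed for illustration),
# so the quadratic least squares fit should recover these coefficients.
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2.0 - t + 0.5 * t**2

# Basis functions phi_j(t) = t^(j-1) give a_ij = t_i^(j-1):
# columns of A are 1, t, t^2 (a Vandermonde matrix).
A = np.vander(t, 3, increasing=True)

x, _, _, _ = np.linalg.lstsq(A, y, rcond=None)
print(x)  # fitted coefficients, ≈ [2.0, -1.0, 0.5]
```

With noise-free data and enough points, the fit reproduces the generating polynomial exactly (up to rounding); with noisy data the same code returns the least squares estimate of the coefficients.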
This note was uploaded on 01/16/2011 for the course MATH 224, taught by Professor Layton, A. during the Fall '08 term at Duke.