
# Linear Least Squares


Suppose we are given a set of data points $\{(x_i, f_i)\}$, $i = 1, \ldots, n$. These could be measurements from an experiment or obtained simply by evaluating a function at some points. You have seen that we can interpolate these points, i.e., either find a polynomial of degree $\le (n-1)$ which passes through all $n$ points, or use a continuous piecewise interpolant of the data, which is usually a better approach. However, it might be the case that we know these data points should lie on, for example, a line or a parabola, but due to experimental error they do not. So what we would like to do is find a line (or some other higher degree polynomial) which best represents the data. Of course, we need to make precise what we mean by a "best fit" of the data.

As a concrete example, suppose we have $n$ points $(x_1, f_1), (x_2, f_2), \ldots, (x_n, f_n)$ and we expect them to lie on a straight line but, due to experimental error, they don't. We would like to draw a line and have that line be the best representation of the points. If $n = 2$, then the line passes through both points and so the error is zero at each point. However, if we have more than two data points, then we can't find a line that passes through all of them (unless they happen to be collinear), so we have to find a line which is a good approximation in some sense.

Of course we need to define what we mean by a good representation. An obvious approach is to form an error vector of length $n$ whose $i$-th component measures the difference $f_i - y(x_i)$, where $y(x) = a + a_1 x$ is the line we fit to the data. We then take a norm of this error vector, and our goal is to find the line which minimizes it. As stated, this problem is not completely defined, because we have not specified which norm to use. The linear least squares problem finds the line which minimizes this error vector in the $\ell_2$ (Euclidean) norm.
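As a sketch of what this minimization computes, the following snippet builds the overdetermined system for a line $y(x) = a + a_1 x$ and solves it in the $\ell_2$ sense with NumPy's `lstsq`. The data points here are made up for illustration; they are not from the text.

```python
import numpy as np

# Hypothetical noisy data that should lie near a straight line.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
f = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Overdetermined system A @ [a, a1] ~ f for the line y(x) = a + a1*x.
A = np.column_stack([np.ones_like(x), x])

# lstsq minimizes || f - A @ coeffs || in the Euclidean (l2) norm.
coeffs, _, _, _ = np.linalg.lstsq(A, f, rcond=None)
a, a1 = coeffs
print(f"best-fit line: y = {a:.4f} + {a1:.4f} x")
```

With five points the line cannot pass through all of them, so the residual vector $f_i - y(x_i)$ is nonzero, but no other line gives a smaller $\ell_2$ residual norm.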
Example: We want to fit a line $p_1(x) = a + a_1 x$ to the data points $(1, 2.2)$, $(0.8, 2.4)$, $(0, 4.25)$ in a linear least squares sense. For now, we will just write the overdetermined system and determine whether it has a solution; we will find the line after we investigate how to solve the linear least squares problem. Our equations are

$$a + a_1 \cdot 1 = 2.2$$
$$a + a_1 \cdot 0.8 = 2.4$$
$$a + a_1 \cdot 0 = 4.25$$

Writing this as a matrix problem $A\vec{x} = \vec{b}$, we have

$$\begin{pmatrix} 1 & 1 \\ 1 & 0.8 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} a \\ a_1 \end{pmatrix} = \begin{pmatrix} 2.2 \\ 2.4 \\ 4.25 \end{pmatrix}$$

Now we know that this over-determined problem has a solution only if the right-hand side is in $R(A)$ (i.e., it is a linear combination of the columns of the coefficient matrix $A$). Here the rank of $A$ is clearly 2, so $R(A)$ is not all of $\mathbb{R}^3$. Moreover, $(2.2, 2.4, 4.25)^T$ is not in $R(A)$, i.e., not in the span of $\{(1, 1, 1)^T, (1, 0.8, 0)^T\}$, and so the system doesn't have a solution. This just means that we can't find a line that passes through all three points.
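The rank argument above can be checked numerically: if $\vec{b}$ were in $R(A)$, appending it to $A$ as an extra column would not increase the rank. A minimal sketch of that check, assuming NumPy:

```python
import numpy as np

# The three equations a + a1*x_i = f_i from the example, as A x = b.
A = np.array([[1.0, 1.0],
              [1.0, 0.8],
              [1.0, 0.0]])
b = np.array([2.2, 2.4, 4.25])

# A has rank 2, so its column space is a 2-D subspace of R^3.
rank_A = np.linalg.matrix_rank(A)

# If b were a linear combination of A's columns, the rank would stay 2.
rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))

print(rank_A, rank_Ab)
```

Here `rank_Ab` exceeds `rank_A`, confirming that no exact solution exists: solving the first two equations gives $a = 3.2$, $a_1 = -1$, which fails the third equation since $3.2 \ne 4.25$.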

## This note was uploaded on 01/15/2012 for the course ISC 5315 taught by Professor Staff during the Spring '11 term at FSU.
