Lecture 9: Curve Fitting — Regression and Interpolation


Curve-Fitting: Overview

- As engineers, you will need to learn to manage a variety of types of data:
  - Experimental data, e.g. PVT data or time-dependent process data
  - Numerical data, e.g. from a numerical solution to a differential equation, or a computer simulation of a fluid
- Often you would like to fit your set of data, say a set of (x_i, y_i) points, to a smooth function y = f(x) in order to capture or describe the data in a simple form.
- There are two major approaches, depending on whether the data is smooth or noisy:
  - Noisy data  -> use regression methods
  - Smooth data -> use interpolation methods
Statistics Review

- Suppose we are given a set of noisy data y_1, y_2, ..., y_n.
- Three important quantities are the mean, the variance, and the standard deviation:

    mean:                ybar = (1/n) * sum_{i=1..n} y_i
    variance:            s^2  = sum_{i=1..n} (y_i - ybar)^2 / (n - 1)
    standard deviation:  s    = sqrt(s^2)
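The three quantities above are straightforward to compute directly. A minimal sketch (the function name `stats` and the sample data are illustrative, not from the lecture):

```python
import math

def stats(y):
    """Return the mean, sample variance (n-1 denominator), and
    standard deviation of a data set."""
    n = len(y)
    mean = sum(y) / n
    variance = sum((yi - mean) ** 2 for yi in y) / (n - 1)
    return mean, variance, math.sqrt(variance)

mean, var, std = stats([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
```

Note the n - 1 ("sample variance") denominator, which matches the formula above; some references divide by n instead.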
Regression

- Regression refers to the process of trying to "fit" a smooth function through a set of noisy data.
- This function could be:
  - A straight line ("linear regression")
  - A higher-order polynomial
  - An arbitrary function with adjustable parameters
- Regression is "over-determined" in the sense that there are usually many more data points than adjustable parameters in the function.
Least-Squares Regression

- There are many criteria that can be used to determine a "best fit" of a data set to a function.
- The most popular is the "least-squares criterion": the squared error or "residual" e_i = y_i - f(x_i) between the data points and the model, summed over all points, should be as small as possible:

    S_r = sum_{i=1..n} e_i^2 = sum_{i=1..n} (y_i - f(x_i))^2

- Notice that if the model is just a constant equal to the mean of the data set, f(x) = ybar, then S_r is simply related to the variance: S_r = (n - 1) s^2.
- More generally, S_r >= 0 is a sum of squared errors between the data and the model (dividing by n gives a mean-squared error).
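The residual sum S_r can be computed for any model function. A short sketch (the helper name and the sample data are assumptions for illustration), which also checks the constant-mean special case S_r = (n - 1) s^2:

```python
def residual_sum_of_squares(x, y, model):
    """S_r = sum of squared residuals e_i = y_i - model(x_i)."""
    return sum((yi - model(xi)) ** 2 for xi, yi in zip(x, y))

# Constant model equal to the data mean: S_r reduces to (n - 1) * variance.
y = [1.0, 2.0, 3.0, 4.0]
ybar = sum(y) / len(y)
sr = residual_sum_of_squares([0.0, 1.0, 2.0, 3.0], y, lambda x: ybar)
```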
Linear Least Squares

- Linear least squares is a special case in which the model is a straight line:

    f(x) = a_0 + a_1 x

  with two parameters, a_0 and a_1.
- Criterion for choosing the parameters: S_r should be minimized with respect to a_0 and a_1:

    dS_r/da_0 = 0,   dS_r/da_1 = 0
Linear Least Squares (continued)

- Setting the two derivatives to zero gives the "normal equations":

    n a_0        + (sum x_i) a_1   = sum y_i
    (sum x_i) a_0 + (sum x_i^2) a_1 = sum x_i y_i

- But since we know the data values (x_i, y_i), all the sums are just numerical coefficients. This is a 2 x 2 system of linear equations in the unknowns a_0 and a_1 that is easy to solve!
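Because the system is only 2 x 2, it can be solved in closed form. A minimal sketch (the function name and the test data are illustrative assumptions), using Cramer's rule on the normal equations:

```python
def linear_least_squares(x, y):
    """Fit y ~ a0 + a1*x by solving the 2x2 normal equations:
         n*a0  + sx*a1  = sy
         sx*a0 + sxx*a1 = sxy
    """
    n = len(x)
    sx  = sum(x)
    sy  = sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    det = n * sxx - sx * sx          # determinant of the 2x2 system
    a1 = (n * sxy - sx * sy) / det   # slope
    a0 = (sy - a1 * sx) / n          # intercept
    return a0, a1

# Data lying exactly on y = 1 + 2x should be recovered exactly.
a0, a1 = linear_least_squares([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

Note that the determinant n*sxx - sx^2 is zero only when all x_i are identical, in which case no unique line exists.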