# Lecture 22 - Engineering Analysis ENG 3420, Fall 2009, Dan C. Marinescu


Engineering Analysis ENG 3420, Fall 2009. Dan C. Marinescu. Office: HEC 439 B. Office hours: Tu-Th 11:00-12:00.

## Lecture 22

**Attention:** The last homework (HW5) and the last project are due on Tuesday, November 24!

**Last time:**
- Linear regression
- Exponential, power, and saturation non-linear models
- Linear least squares regression

**Today:**
- Linear regression versus the sample mean
- Coefficient of determination
- Polynomial least squares fit
- Multiple linear regression
- General linear squares
- More on non-linear models
- Interpolation (Chapter 15): polynomial interpolation, Newton interpolating polynomials, Lagrange interpolating polynomials

**Next time:** Splines
## Quantification of Errors

For a straight line, the sum of the squares of the estimate residuals is:

$$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left( y_i - a_0 - a_1 x_i \right)^2$$

The standard error of the estimate:

$$s_{y/x} = \sqrt{\frac{S_r}{n-2}}$$
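These quantities are easy to compute directly. A minimal sketch in Python/NumPy (used here as a stand-in for the MATLAB workflow in these notes; the function and variable names are illustrative, not from the lecture):

```python
import numpy as np

def linear_fit_errors(x, y):
    """Fit y ~ a0 + a1*x by least squares; return a0, a1, Sr, s_yx.

    Sr is the sum of squared residuals; s_yx = sqrt(Sr / (n - 2)) is the
    standard error of the estimate (n - 2 degrees of freedom, since two
    coefficients are fitted).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    a1, a0 = np.polyfit(x, y, 1)        # slope, intercept (highest power first)
    residuals = y - (a0 + a1 * x)
    Sr = np.sum(residuals**2)
    s_yx = np.sqrt(Sr / (n - 2))
    return a0, a1, Sr, s_yx

a0, a1, Sr, s_yx = linear_fit_errors([0, 1, 2], [0, 2, 3])
```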

## Linear regression versus the sample mean

What is the difference between linear regression and simply computing the sample mean and drawing a horizontal line at that value? Compare the spreads: the histogram of the differences between the sample values and the mean, versus the histogram of the residuals of the linear fit. Regression data show (a) the spread of the data around the mean of the dependent variable and (b) the spread of the data around the best-fit line; the reduction in spread represents the improvement due to linear regression.
## Coefficient of Determination

The coefficient of determination $r^2$ represents the fraction of the original uncertainty explained by the model:

$$r^2 = \frac{S_t - S_r}{S_t}, \qquad S_t = \sum_{i=1}^{n} \left( y_i - \bar{y} \right)^2$$

For a perfect fit, $S_r = 0$ and $r^2 = 1$. If $r^2 = 0$, there is no improvement over simply picking the mean. If $r^2 < 0$, the model is worse than simply picking the mean!

## Example

| i | x_i = V (m/s) | y_i = F (N) | a_0 + a_1 x_i | (y_i − ȳ)² | (y_i − a_0 − a_1 x_i)² |
|---|---|---|---|---|---|
| 1 | 10 | 25 | −39.58 | 380535 | 4171 |
| 2 | 20 | 70 | 155.12 | 327041 | 7245 |
| 3 | 30 | 380 | 349.82 | 68579 | 911 |
| 4 | 40 | 550 | 544.52 | 8441 | 30 |
| 5 | 50 | 610 | 739.23 | 1016 | 16699 |
| 6 | 60 | 1220 | 933.93 | 334229 | 81837 |
| 7 | 70 | 830 | 1128.63 | 35391 | 89180 |
| 8 | 80 | 1450 | 1323.33 | 653066 | 16044 |
| Σ | 360 | 5135 | | 1808297 | 216118 |

The fitted line is $F_{est} = -234.2857 + 19.47024\,v$, with

$$S_t = \sum \left( y_i - \bar{y} \right)^2 = 1808297 \qquad S_r = \sum \left( y_i - a_0 - a_1 x_i \right)^2 = 216118$$

$$s_y = \sqrt{\frac{1808297}{8-1}} = 508.26 \qquad s_{y/x} = \sqrt{\frac{216118}{8-2}} = 189.79$$

$$r^2 = \frac{1808297 - 216118}{1808297} = 0.8805$$

88.05% of the original uncertainty has been explained by the linear model.
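The bottom-line numbers in this example can be reproduced with a short script. A sketch in Python/NumPy (a stand-in for the MATLAB workflow, not the lecture's own code):

```python
import numpy as np

# Data from the example: velocity (m/s) vs. force (N)
v = np.array([10, 20, 30, 40, 50, 60, 70, 80], dtype=float)
F = np.array([25, 70, 380, 550, 610, 1220, 830, 1450], dtype=float)
n = len(v)

a1, a0 = np.polyfit(v, F, 1)            # F_est = a0 + a1*v
St = np.sum((F - F.mean())**2)          # spread around the mean
Sr = np.sum((F - (a0 + a1 * v))**2)     # spread around the fit line
s_y = np.sqrt(St / (n - 1))             # standard deviation of F
s_yx = np.sqrt(Sr / (n - 2))            # standard error of the estimate
r2 = (St - Sr) / St                     # coefficient of determination
```

Running this recovers the table's totals (up to rounding): slope 19.47024, intercept −234.2857, and r² ≈ 0.8805.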
## Polynomial least squares fit

MATLAB has a built-in function `polyfit` that fits a least-squares n-th order polynomial to data:

`p = polyfit(x, y, n)`

- `x`: independent data
- `y`: dependent data
- `n`: order of the polynomial to fit
- `p`: coefficients of the polynomial, $f(x) = p_1 x^n + p_2 x^{n-1} + \ldots + p_n x + p_{n+1}$

MATLAB's `polyval` command evaluates the polynomial at given points using these coefficients.
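NumPy provides the same pair of functions with the same coefficient ordering (highest power first). A minimal sketch in Python rather than MATLAB:

```python
import numpy as np

# Fit a 2nd-order polynomial to points that lie exactly on y = x^2
x = np.array([0.0, 1.0, 2.0, 3.0])
y = x**2

p = np.polyfit(x, y, 2)     # coefficients, highest power first: p ~ [1, 0, 0]
y4 = np.polyval(p, 4.0)     # evaluate the fitted polynomial at x = 4
```

Because the data lie exactly on a parabola, the fit recovers the coefficients up to floating-point error, and `polyval` at x = 4 returns ~16.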


## This note was uploaded on 02/17/2012 for the course EGN 3420 taught by Professor Staff during the Spring '08 term at University of Central Florida.


