Lecture 22 - Engineering Analysis, ENG 3420, Fall 2009

Engineering Analysis ENG 3420, Fall 2009
Dan C. Marinescu
Office: HEC 439 B
Office hours: Tu-Th 11:00-12:00

Lecture 22
- Attention: The last homework (HW5) and the last project are due on Tuesday, November 24!
- Last time:
  - Linear regression
  - Exponential, power, and saturation non-linear models
  - Linear least-squares regression
- Today:
  - Linear regression versus the sample mean; the coefficient of determination
  - Polynomial least-squares fit
  - Multiple linear regression
  - General linear least squares
  - More on non-linear models
  - Interpolation (Chapter 15)
    - Polynomial interpolation
    - Newton interpolating polynomials
    - Lagrange interpolating polynomials
- Next time:
  - Splines

Quantification of Errors
- For a straight line, the sum of the squares of the estimate residuals is:

  S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2

- The standard error of the estimate:

  s_{y/x} = \sqrt{S_r / (n - 2)}
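
A minimal MATLAB sketch of these two formulas (not from the slides; it assumes the intercept a0 and slope a1 have already been obtained from the linear regression, and that x and y are column vectors of the same length):

  % Residuals and standard error of the estimate for a straight-line fit
  n   = length(y);
  e   = y - (a0 + a1*x);       % estimate residuals e_i = y_i - a0 - a1*x_i
  Sr  = sum(e.^2);             % sum of the squared residuals
  syx = sqrt(Sr / (n - 2));    % standard error of the estimate s_{y/x}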

Linear regression versus the sample mean
- What is the difference between linear regression and simply computing the sample mean and drawing a horizontal line at that value?
- Look at the spread: the histogram of the differences between the values predicted by linear regression and the actual sample values.
- (Figure) Regression data showing (a) the spread of the data around the mean of the dependent variable and (b) the spread of the data around the best-fit line.
- The reduction in spread from (a) to (b) represents the improvement due to linear regression.
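
A sketch of how the two spreads could be visualized in MATLAB (an illustration, not code from the slides; x and y are assumed to be sample vectors):

  % Compare the spread around the mean with the spread around the fit
  p    = polyfit(x, y, 1);          % straight-line least-squares fit
  yhat = polyval(p, x);             % values predicted by the regression
  subplot(2,1,1); hist(y - mean(y)); title('spread around the mean')
  subplot(2,1,2); hist(y - yhat);    title('spread around the best-fit line')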

Coefficient of Determination
- The coefficient of determination is

  r^2 = (S_t - S_r) / S_t,   where   S_t = \sum_{i=1}^{n} (y_i - \bar{y})^2

- r^2 represents the fraction of the original uncertainty explained by the model.
- For a perfect fit, S_r = 0 and r^2 = 1.
- If r^2 = 0, there is no improvement over simply picking the mean.
- If r^2 < 0, the model is worse than simply picking the mean!
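
In MATLAB this is a one-liner once S_t and S_r are known (a sketch under the same assumptions as above):

  % Coefficient of determination for a straight-line fit
  St = sum((y - mean(y)).^2);        % total spread around the sample mean
  Sr = sum((y - a0 - a1*x).^2);      % spread around the regression line
  r2 = (St - Sr) / St;               % fraction of uncertainty explained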

Example

  i   x_i = v (m/s)   y_i = F (N)   a_0 + a_1 x_i   (y_i - ybar)^2   (y_i - a_0 - a_1 x_i)^2
  1        10              25           -39.58           380535              4171
  2        20              70           155.12           327041              7245
  3        30             380           349.82            68579               911
  4        40             550           544.52             8441                30
  5        50             610           739.23             1016             16699
  6        60            1220           933.93           334229             81837
  7        70             830          1128.63            35391             89180
  8        80            1450          1323.33           653066             16044
  Σ       360            5135                            1808297            216118

  F_est = -234.2857 + 19.47024 v
  S_t = \sum (y_i - \bar{y})^2 = 1808297
  S_r = \sum (y_i - a_0 - a_1 x_i)^2 = 216118
  s_y = \sqrt{1808297 / (8 - 1)} = 508.26
  s_{y/x} = \sqrt{216118 / (8 - 2)} = 189.79
  r^2 = (1808297 - 216118) / 1808297 = 0.8805

88.05% of the original uncertainty has been explained by the linear model.
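
The whole example can be reproduced in MATLAB (a sketch; the variable names are mine, the data are the v and F values from the table):

  % Reproduce the worked example
  v = (10:10:80)';                            % x_i, velocity in m/s
  F = [25 70 380 550 610 1220 830 1450]';     % y_i, force in N
  p    = polyfit(v, F, 1);                    % p = [19.47024, -234.2857]
  Fest = polyval(p, v);                       % the a_0 + a_1*x_i column
  St  = sum((F - mean(F)).^2);                % 1808297
  Sr  = sum((F - Fest).^2);                   % 216118
  sy  = sqrt(St / (length(F) - 1));           % 508.26
  syx = sqrt(Sr / (length(F) - 2));           % 189.79
  r2  = (St - Sr) / St;                       % 0.8805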

Polynomial least-squares fit
- MATLAB has a built-in function polyfit that fits a least-squares n-th order polynomial to data:
  p = polyfit(x, y, n)
  - x: independent data
  - y: dependent data
  - n: order of the polynomial to fit
  - p: coefficients of the polynomial
    f(x) = p_1 x^n + p_2 x^{n-1} + ... + p_n x + p_{n+1}
- MATLAB's polyval command can be used to evaluate the polynomial at given points using these coefficients; see the usage sketch below.
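
A short usage sketch (the data values here are made up for illustration):

  % Fit a 2nd-order polynomial and evaluate it on a fine grid
  x = [0 1 2 3 4 5];
  y = [2.1 7.7 13.6 27.2 40.9 61.1];   % illustrative data
  p  = polyfit(x, y, 2);               % least-squares quadratic coefficients
  xe = linspace(0, 5);                 % 100 evaluation points
  ye = polyval(p, xe);                 % f(xe) from the fitted coefficients
  plot(x, y, 'o', xe, ye, '-')         % data points and the fitted curve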