EAD 115 Numerical Solution of Engineering and Scientific Problems
David M. Rocke
Department of Applied Science

Least Squares Regression

Curve Fitting

• Given a set of n points (x_i, y_i), find a fitted curve that provides a fitted value y = f(x) for each value of x in a range.
• The curve may interpolate the points (go through each one), either linearly or nonlinearly, or may approximate the points without going through each one, as in least-squares regression.
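
As a minimal sketch of this distinction, assuming NumPy and made-up data (np.polyfit handles both cases):

```python
import numpy as np

# Illustrative data: five points, not from the lecture.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.0, 2.8, 4.3, 4.9])

# Interpolation: a degree-(n-1) polynomial passes through every point.
interp = np.polyfit(x, y, deg=len(x) - 1)

# Least-squares approximation: a line that need not hit any point.
line = np.polyfit(x, y, deg=1)

print("interpolation residuals:", y - np.polyval(interp, x))   # ~ 0
print("least-squares residuals:", y - np.polyval(line, x))
```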

Simple Linear Regression

• We have a set of n data points, each of which has a measured predictor x and a measured response y.
• We wish to develop a prediction function f(x) for y.
• In the simplest case, we take f(x) to be a linear function of x, as in f(x) = a_0 + a_1 x.

• The least-squares criterion minimizes

$$SS = \sum_{i=1}^{n} r_i^2 = \sum_{i=1}^{n} \bigl( y_i - f(x_i) \bigr)^2$$

• There are many other possible criteria. Use of the least-squares criterion does not imply any beliefs about the data.
• Use of the linear form for f(x) assumes that this straight-line relationship is reasonable.
• Assumptions are needed for inference about the predictions or about the relationship itself.
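
As a sketch, the criterion is easy to evaluate for any candidate coefficients (the data and trial values below are illustrative):

```python
import numpy as np

def sum_of_squares(a0, a1, x, y):
    """SS = sum of squared residuals r_i = y_i - (a0 + a1 * x_i)."""
    residuals = y - (a0 + a1 * x)
    return np.sum(residuals ** 2)

# Illustrative data and a trial line; least squares picks the
# (a0, a1) pair that makes SS as small as possible.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.0, 2.8, 4.3, 4.9])
print(sum_of_squares(1.0, 1.0, x, y))
```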

Computing the Least-Squares Solution

• We wish to minimize the sum of squares of deviations from the regression line by choosing the coefficients a_0 and a_1 accordingly.
• Since this is a continuous, quadratic function of the coefficients, one can simply set the partial derivatives equal to zero.

$$SS(a_0, a_1) = \sum_{i=1}^{n} r_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$$

$$\frac{\partial SS(a_0, a_1)}{\partial a_0} = -2 \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i) = 0$$

$$\frac{\partial SS(a_0, a_1)}{\partial a_1} = -2 \sum_{i=1}^{n} x_i (y_i - a_0 - a_1 x_i) = 0$$

Rearranging gives the normal equations:

$$n a_0 + \Bigl( \sum_{i=1}^{n} x_i \Bigr) a_1 = \sum_{i=1}^{n} y_i$$

$$\Bigl( \sum_{i=1}^{n} x_i \Bigr) a_0 + \Bigl( \sum_{i=1}^{n} x_i^2 \Bigr) a_1 = \sum_{i=1}^{n} x_i y_i$$
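
A sketch of solving these two equations in closed form (the helper name fit_line and the data are illustrative, not from the course):

```python
import numpy as np

def fit_line(x, y):
    """Solve the two normal equations for a0 and a1 in closed form."""
    n = len(x)
    sx, sy = np.sum(x), np.sum(y)
    sxx, sxy = np.sum(x * x), np.sum(x * y)
    # Eliminate a0 between the two normal equations to get a1,
    # then back-substitute for a0.
    a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    a0 = (sy - a1 * sx) / n
    return a0, a1

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.0, 2.8, 4.3, 4.9])
a0, a1 = fit_line(x, y)
print(f"fitted line: y = {a0:.2f} + {a1:.2f} x")   # y = 1.04 + 0.99 x
```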

• These normal equations have a unique solution as two equations in two unknowns.
• The straight line that is calculated in this way is used in practice to see if there is a relationship between x and y.
• It is also used to predict y from x.
• It can also be used to predict x from y by inverting the equation.
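
Continuing the sketch, both directions of prediction are one-line rearrangements (the coefficients are the illustrative values fitted above; inversion assumes a_1 is nonzero):

```python
a0, a1 = 1.04, 0.99   # illustrative coefficients from the fit above

def predict_y(x_new):
    """Predict y from x using the fitted line."""
    return a0 + a1 * x_new

def predict_x(y_new):
    """Predict x from y by inverting y = a0 + a1*x (requires a1 != 0)."""
    return (y_new - a0) / a1

print(predict_y(2.5), predict_x(3.5))
```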

Measuring Variation

• If we do not use any predictor, the variability of y is its variance, or mean square difference between y and the mean of all the y's.
• If we use a predictor, then the variability is the mean square difference between y and its prediction.

$$MST = \frac{1}{n-1} \sum_{i=1}^{n} (y_i - \bar{y})^2$$

$$MSE = \frac{1}{n-2} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2, \qquad \hat{y}_i = a_0 + a_1 x_i$$

$$MSR = (SST - SSE)/1$$
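
A sketch of these quantities computed per the definitions above (data and coefficients carried over from the earlier illustrative sketches; SST and SSE are the unscaled sums of squares):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.0, 2.8, 4.3, 4.9])
a0, a1 = 1.04, 0.99            # illustrative fitted coefficients
n = len(y)

y_hat = a0 + a1 * x            # fitted values
sst = np.sum((y - np.mean(y)) ** 2)   # total variation about the mean
sse = np.sum((y - y_hat) ** 2)        # variation left after prediction

mst = sst / (n - 1)
mse = sse / (n - 2)
msr = (sst - sse) / 1          # regression sum of squares, 1 d.f.
print(mst, mse, msr)
```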

Multiple Regression

• If we have more than one predictor, we can still fit the least-squares equations so long as we don't have more coefficients than data points.
• This involves solving the normal equations as a matrix equation.

For a single observation, y = xB + ε; stacking all observations, Y = XB + E, where
n = number of data points and p = number of predictors including the constant.

y is 1 × 1, x is 1 × p, B is p × 1, ε is 1 × 1;
Y is n × 1, X is n × p, E is n × 1.
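
A sketch of the matrix form of the normal equations, (XᵀX)B = XᵀY, solved with NumPy (illustrative data; in practice np.linalg.lstsq avoids forming XᵀX explicitly):

```python
import numpy as np

# Illustrative data: n = 5 points, p = 2 predictors (constant + x).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
Y = np.array([1.1, 2.0, 2.8, 4.3, 4.9])
X = np.column_stack([np.ones_like(x), x])   # n x p design matrix

# Normal equations in matrix form: (X^T X) B = X^T Y.
B = np.linalg.solve(X.T @ X, X.T @ Y)
print("B =", B)   # B[0] = a0, B[1] = a1
```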

Linearization of Nonlinear Relationships

• We can fit a curved relationship with a polynomial.
• The relationship f(x) = a_0 + a_1 x + a_2 x^2 can be treated as a problem with two predictors.
• This can then be dealt with as any multiple regression problem.
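
As a sketch, the quadratic fit reuses the matrix machinery above, with x and x^2 as the two predictors (the curved data are made up for illustration):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.2, 1.9, 3.1, 5.2, 8.1])    # illustrative, curved data

# Design matrix with columns 1, x, x^2: the constant plus two predictors.
X = np.column_stack([np.ones_like(x), x, x ** 2])
a = np.linalg.solve(X.T @ X, X.T @ y)
print(f"f(x) = {a[0]:.3f} + {a[1]:.3f} x + {a[2]:.3f} x^2")
```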