mws_gen_reg_ppt_linear - Linear Regression (Major: All Engineering Majors)

Linear Regression
Major: All Engineering Majors
Authors: Autar Kaw, Luke Snyder
http://numericalmethods.eng.usf.edu
Transforming Numerical Methods Education for STEM Undergraduates
1/10/2010

Linear Regression
http://numericalmethods.eng.usf.edu
What is Regression?

Given $n$ data points $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$, best fit $y = f(x)$ to the data. The best fit is generally based on minimizing the sum of the squares of the residuals, $S_r$.

Residual at a point: $\varepsilon_i = y_i - f(x_i)$

Sum of the squares of the residuals: $S_r = \sum_{i=1}^{n} \left( y_i - f(x_i) \right)^2$

Figure. Basic model for regression: data points $(x_1, y_1), \ldots, (x_n, y_n)$ and the fitted curve $y = f(x)$.
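As a side note that is not part of the original slides, the sum of the squares of the residuals translates directly into a few lines of Python; the function name, arguments, and sample numbers below are illustrative only.

def sum_square_residuals(x, y, f):
    # S_r = sum over i of (y_i - f(x_i))^2 for data points (x_i, y_i) and a candidate model f
    return sum((yi - f(xi)) ** 2 for xi, yi in zip(x, y))

# illustrative usage: S_r for the straight line y = 2x with two data points
# sum_square_residuals([1.0, 2.0], [2.5, 3.5], lambda t: 2 * t)  ->  0.25 + 0.25 = 0.5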
Linear Regression - Criterion #1

Given $n$ data points $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$, best fit $y = a_0 + a_1 x$ to the data.

Does minimizing $\sum_{i=1}^{n} \varepsilon_i$ work as a criterion, where $\varepsilon_i = y_i - (a_0 + a_1 x_i)$?

Figure. Linear regression of y vs. x data showing the residual $\varepsilon_i = y_i - (a_0 + a_1 x_i)$ at a typical point $x_i$.
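A minimal Python sketch, not from the slides, of what Criterion #1 computes for the straight-line model; the helper name sum_residuals and its argument names are illustrative assumptions.

def sum_residuals(x, y, a0, a1):
    # Criterion #1: sum of the signed residuals eps_i = y_i - (a0 + a1*x_i)
    return sum(yi - (a0 + a1 * xi) for xi, yi in zip(x, y))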
Example for Criterion #1

Example: Given the data points (2,4), (3,6), (2,6) and (3,8), best fit the data to a straight line using Criterion #1.

Table. Data points
x     y
2.0   4.0
3.0   6.0
2.0   6.0
3.0   8.0

Figure. Data points for y vs. x data.
Linear Regression - Criterion #1

Using y = 4x - 4 as the regression curve:

Table. Residuals at each point for the regression model y = 4x - 4
x     y     y_predicted   ε = y - y_predicted
2.0   4.0   4.0            0.0
3.0   6.0   8.0           -2.0
2.0   6.0   4.0            2.0
3.0   8.0   8.0            0.0

$\sum_{i=1}^{4} \varepsilon_i = 0$

Figure. Regression curve y = 4x - 4 for the y vs. x data.
Linear Regression - Criterion #1

Using y = 6 as the regression curve:

Table. Residuals at each point for the regression model y = 6
x     y     y_predicted   ε = y - y_predicted
2.0   4.0   6.0           -2.0
3.0   6.0   6.0            0.0
2.0   6.0   6.0            0.0
3.0   8.0   6.0            2.0

$\sum_{i=1}^{4} \varepsilon_i = 0$

Figure. Regression curve y = 6 for the y vs. x data.
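The two tables above can be reproduced with the sum_residuals sketch from the Criterion #1 slide (again, illustrative code, not part of the original slides). Both candidate lines give a zero sum of signed residuals even though they fit the data very differently, which is exactly why the earlier slide asks whether minimizing the sum of residuals works as a criterion.

x = [2.0, 3.0, 2.0, 3.0]
y = [4.0, 6.0, 6.0, 8.0]
# y = 4x - 4  ->  a0 = -4, a1 = 4;   y = 6  ->  a0 = 6, a1 = 0
for a0, a1 in [(-4.0, 4.0), (6.0, 0.0)]:
    print(a0, a1, sum_residuals(x, y, a0, a1))  # both sums print as 0.0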
Linear Regression - Criterion #1