Linear Regression with Multiple Variables
J. M. Cimbala

Consider the function

    y = 10 + 2 x1 + 3 x2 - x3

The following table is created using this function:

    x1    x2    x3     y
     0     0     0    10
     0    -1     0     7
     0     1    -1    14
    -1     0    -1     9
    -1    -1     1     4
    -1     1     1    10
     1     0     0    12
     1    -1    -1    10
     1     1     1    14

Now a linear regression is performed with y as a function of all three independent variables (Tools-Data Analysis-Regression; in Office 2007: Data-Data Analysis, in the Analysis area, then Regression), which produces the following output:

SUMMARY OUTPUT

Regression Statistics
  Multiple R           1
  R Square             1
  Adjusted R Square    1
  Standard Error       0
  Observations         9

ANOVA
                df      SS       MS          F    Significance F
  Regression     3      82    27.33   6.08E+32                 0
  Residual       5       0        0
  Total          8      82

                Coefficients  Standard Error     t Stat  P-value  Lower 95%  Upper 95%  Lower 95.0%  Upper 95.0%
  Intercept               10               0   1.41E+17        0         10         10           10           10
  X Variable 1             2               0   2.28E+16        0          2          2            2            2
  X Variable 2             3               0   3.41E+16        0          3          3            3            3
  X Variable 3            -1               0  -1.12E+16        0         -1         -1           -1           -1
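Excel's Regression tool is solving an ordinary least-squares problem, so the same coefficients can be reproduced outside Excel. The following is a minimal sketch, not part of the original handout, assuming Python with NumPy; it rebuilds the nine data points from the table above and fits y against x1, x2, and x3, recovering the intercept 10 and slopes 2, 3, and -1 with essentially zero residual.

import numpy as np

# The nine (x1, x2, x3) combinations from the table above
X = np.array([
    [ 0,  0,  0],
    [ 0, -1,  0],
    [ 0,  1, -1],
    [-1,  0, -1],
    [-1, -1,  1],
    [-1,  1,  1],
    [ 1,  0,  0],
    [ 1, -1, -1],
    [ 1,  1,  1],
], dtype=float)

# y is generated exactly from the known function y = 10 + 2 x1 + 3 x2 - x3
y = 10 + 2*X[:, 0] + 3*X[:, 1] - X[:, 2]

# Prepend a column of ones so the first fitted coefficient is the intercept
A = np.column_stack([np.ones(len(y)), X])

# Ordinary least squares: minimize ||A b - y||^2
coeffs, ss_resid, rank, _ = np.linalg.lstsq(A, y, rcond=None)

print("Intercept and slopes:", coeffs)       # expect approximately [10.  2.  3. -1.]
print("Residual sum of squares:", ss_resid)  # essentially zero for this perfect fit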
Discussion:

"Intercept" is the y-intercept.
"X Variable 1" is the slope of y with respect to the first independent variable, x1.
"X Variable 2" is the slope of y with respect to the second independent variable, x2.
"X Variable 3" is the slope of y with respect to the third independent variable, x3.

So, we write the best-fit equation as

    y = 10 + 2 x1 + 3 x2 - x3

[This agrees with the equation we started with, verifying our procedure.]
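As a quick sanity check of the best-fit equation (a hypothetical illustration, not part of the original handout), the fitted coefficients can be used directly for prediction; the Python snippet below evaluates the equation at two tabulated points and at one new point inside the fitted range.

# Best-fit equation from the regression output above: y = 10 + 2 x1 + 3 x2 - x3
def y_fit(x1, x2, x3):
    return 10 + 2*x1 + 3*x2 - x3

# Reproduce two rows of the original table
print(y_fit(0, -1, 0))   # 7, matches the table
print(y_fit(1, 1, 1))    # 14, matches the table

# Predict at a point not in the table (within the range of the fitted data)
print(y_fit(0.5, -0.5, 0.0))  # 9.5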