Class 7 - Regression

Regression Analysis uses a mathematical model to describe the relationship between one variable and one or more other variables:

Dependent Variable = f[ Independent Variable(s) ]
or Response Variable = f[ Predictor Variable(s) ]
or Y Variable = f[ X Variable(s) ]

Simple Linear Regression - one independent (predictor) variable, using a straight-line model.
Multiple Regression - more than one independent (predictor) variable.

Other names used for the independent X variable(s) are Explanatory Variable(s), Regressor(s), Input Variable(s), or Exogenous Variable(s).

Classic formula for a line (used by Excel for a graph trendline): Y = m X + b, where m = slope and b = Y intercept.

Phenomenon or Population Linear Regression Notation (page 439):
Y = β0 + β1 X + ε, where
β0 = Y intercept for the population regression line
β1 = slope for the population regression line
ε = random error (this error term shows that Y values vary around the population regression line)
σ²(ε) = Variance(ε) = variance of the random errors

Sample Regression Line for Simple Linear Regression (line fitted to the sample data):
Y-hat = b0 + b1 X, where
b0 = Y intercept for the regression line fitted to the sample data
b1 = slope for the regression line fitted to the sample data
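For illustration only (not from the textbook), a minimal NumPy sketch that computes the least-squares estimates b0 and b1 for a small sample; the data values are made up.

```python
import numpy as np

# Hypothetical sample data (invented for illustration only).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8])

# Closed-form least-squares estimates for the sample regression line
# Y-hat = b0 + b1 * X  (the same quantities Excel reports as slope m and intercept b).
x_bar, y_bar = x.mean(), y.mean()
b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)  # slope
b0 = y_bar - b1 * x_bar                                            # Y intercept

print(f"b0 (intercept) = {b0:.4f}, b1 (slope) = {b1:.4f}")
```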
Multiple Linear Regression with k variables

Phenomenon (population) model for Y (page 514):
Y = β0 + β1 X1 + β2 X2 + ... + βk Xk + ε

Sample linear regression model with estimated coefficients (page 511):
Y-hat = b0 + b1 X1 + b2 X2 + ... + bk Xk
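A similar illustrative sketch for the multiple regression case, again with invented data and k = 2 predictor variables, solving for b0, b1, b2 with NumPy's least-squares routine:

```python
import numpy as np

# Hypothetical data with k = 2 predictor variables (values invented for illustration).
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 6.0]])
y = np.array([6.1, 6.9, 13.2, 14.1, 20.3])

# Prepend a column of ones so the first coefficient returned is the intercept b0.
X_design = np.column_stack([np.ones(len(y)), X])

# Least-squares solution: estimated coefficients b0, b1, ..., bk.
b, *_ = np.linalg.lstsq(X_design, y, rcond=None)
print("estimated coefficients [b0, b1, b2]:", b)
```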

Predicted Value of Y = Ŷ (also denoted by Y-hat)
Y-hat = b0 + b1 X (simple model)
Y-hat = f[ predictor variable(s) ]
Residual = Y - Y-hat = error estimate based on the estimated regression model

Page 523:
SS(Error) = SSE = Sum of Squared Errors = Sum of Squared Residuals
SS(Total) = SST = SS(Y) = sum of squared deviations of the Y values from the sample mean of Y, (9.10) on page 409 of Canavos & Miller, 1999
SS(Regression) = SSR = sum of squares attributable to the regression model
SS(Total) = SS(Regression) + SS(Error), i.e., SST = SSR + SSE

The method of least squares selects the regression model coefficients that minimize the value of SSE for a set of data. (Least squares estimates = bj)
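The sum-of-squares decomposition can be checked numerically. This sketch reuses the same invented simple-regression data from above to compute SSE, SSR, and SST and verify SST = SSR + SSE:

```python
import numpy as np

# Same hypothetical simple-regression data as the earlier sketch.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8])
b1, b0 = np.polyfit(x, y, deg=1)           # least-squares slope and intercept

y_hat = b0 + b1 * x                        # predicted values of Y
residuals = y - y_hat                      # Y - Y-hat

sse = np.sum(residuals ** 2)               # SS(Error): sum of squared residuals
sst = np.sum((y - y.mean()) ** 2)          # SS(Total): squared deviations from the mean of Y
ssr = np.sum((y_hat - y.mean()) ** 2)      # SS(Regression)

print(f"SSE = {sse:.4f}, SSR = {ssr:.4f}, SST = {sst:.4f}")
print("SST = SSR + SSE:", np.isclose(sst, ssr + sse))
```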
R-square = R² = Coefficient of Determination
R-square = proportion of the total variability that can be explained using the fitted regression model (page 204 & page 523)
R-square = SS(Regression) / SS(Total)
R² = SSR / SST = (SST - SSE) / SST = 1 - (SSE / SST)

SSE = Σ (Yi - Ŷi)², summed over i = 1 to n
SST = Σ (Yi - Ȳ)² = Σ Yi² - (Σ Yi)² / n, summed over i = 1 to n
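A short check of the two equivalent R-square formulas on the same invented data (illustrative only):

```python
import numpy as np

# R-square computed two equivalent ways for the hypothetical data used above.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8])
b1, b0 = np.polyfit(x, y, deg=1)
y_hat = b0 + b1 * x

sse = np.sum((y - y_hat) ** 2)             # SS(Error)
sst = np.sum((y - y.mean()) ** 2)          # SS(Total)
ssr = sst - sse                            # SS(Regression)

r_square = ssr / sst                       # R² = SSR / SST
print(f"R-square = {r_square:.4f}")
print("same as 1 - SSE/SST:", np.isclose(r_square, 1 - sse / sst))
```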

Estimation of the Variance of the Errors (pages 203, 443 & 512)

MSE denotes the sample estimate of the error variance for the Y values (MSE = Mean Square Error).
MSE = SSE / (degrees of freedom for error) = SS(Residual) / (degrees of freedom for residual)

Degrees of Freedom Total = df(Total) = n - 1
Degrees of Freedom Regression = df(Reg) = number of predictor variables = k
Degrees of Freedom Error = df(Error) = df(Total) - df(Regression) = n - k - 1
df(Total) = df(Regression) + df(Error)

In Excel Regression output, Standard Error = square root of MSE.
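A final sketch (same invented data, k = 1 predictor) showing MSE = SSE / (n - k - 1) and the Standard Error that Excel's Regression tool reports as the square root of MSE:

```python
import numpy as np

# MSE and the regression "Standard Error" for the hypothetical simple-regression data.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 5.9, 8.2, 9.8])
b1, b0 = np.polyfit(x, y, deg=1)
y_hat = b0 + b1 * x

n = len(y)                                 # number of observations
k = 1                                      # number of predictor variables

sse = np.sum((y - y_hat) ** 2)
df_error = n - k - 1                       # df(Error) = df(Total) - df(Regression)
mse = sse / df_error                       # sample estimate of the error variance
standard_error = np.sqrt(mse)              # square root of MSE

print(f"MSE = {mse:.4f}, Standard Error = {standard_error:.4f}")
```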