**Introduction to Probability and Statistics, Thirteenth Edition**

**Chapter 13: Multiple Regression Analysis**

**Introduction**

We extend the concept of simple linear regression to investigate a response y that is affected by several independent variables, x₁, x₂, x₃, ..., xₖ. Our objective is to use the information provided by the xᵢ to predict the value of y.

**Example**

Let y be a student's college achievement, measured by his or her GPA. This might be a function of several variables:

- x₁ = rank in high school class
- x₂ = high school's overall rating
- x₃ = high school GPA

**Example**

Let y be the monthly sales revenue for a company. This might be a function of several variables:

- x₁ = advertising expenditure
- x₂ = time of year
- x₃ = state of the economy

**Some Questions**

- How well does the model fit?
- How strong is the relationship between y and the predictor variables?
- Have any assumptions been violated?
- How good are the estimates and predictions?

We collect information using n observations on the response y and the independent variables x₁, x₂, ..., xₖ.

**The General Linear Model**

y = β₀ + β₁x₁ + β₂x₂ + ... + βₖxₖ + ε

**The Random Error**

The deterministic part of the model, E(y) = β₀ + β₁x₁ + β₂x₂ + ... + βₖxₖ, describes the average value of y for any fixed values of x₁, x₂, ..., xₖ. The random error ε accounts for the deviation of an individual response from this average.

**Example**

Consider the model E(y) = β₀ + β₁x₁ + β₂x₂. This is a first-order model: the independent variables appear only to the first power.

- β₀ is the y-intercept, the value of E(y) when x₁ = x₂ = 0.
- β₁ and β₂ are the partial regression coefficients: βᵢ is the change in y for a one-unit change in xᵢ when the other independent variables are held constant.
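A minimal Python sketch of the first-order model above. The coefficient values (b₀ = 2.0, b₁ = 0.5, b₂ = −1.5) are hypothetical, chosen only to illustrate what a partial regression coefficient means:

```python
# Hypothetical coefficients for a first-order model E(y) = b0 + b1*x1 + b2*x2.
# These numbers are illustrative, not taken from the text.

def mean_response(x1, x2, b0=2.0, b1=0.5, b2=-1.5):
    """Deterministic part of the model: E(y) for fixed x1 and x2."""
    return b0 + b1 * x1 + b2 * x2

# Holding x2 constant, a one-unit increase in x1 changes E(y) by exactly b1:
before = mean_response(x1=3.0, x2=4.0)
after = mean_response(x1=4.0, x2=4.0)
print(after - before)  # prints 0.5, the partial regression coefficient b1
```

The difference is b₁ no matter what fixed value x₂ takes, which is precisely the "other variables held constant" interpretation.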
**The Method of Least Squares**

The best-fitting prediction equation is calculated from a set of n measurements (y, x₁, x₂, ..., xₖ) as

ŷ = b₀ + b₁x₁ + ... + bₖxₖ

We choose the estimates b₀, b₁, ..., bₖ of β₀, β₁, ..., βₖ to minimize

SSE = Σ(y − ŷ)² = Σ(y − b₀ − b₁x₁ − ... − bₖxₖ)²

**Example**

A computer database in a small community contains the listed selling price y (in thousands of dollars) for each property, along with four predictor variables: square feet (x₁), number of floors (x₂), bedrooms (x₃), and baths (x₄). Fit a first-order model to the data using the method of least squares.

**Example**

The first-order model E(y) = β₀ + β₁x₁ + β₂x₂ + β₃x₃ + β₄x₄ is fit using Minitab with the values of y and the four predictor variables.

Regression Analysis: ListPrice versus SqFeet, NumFlrs, Bdrms, Baths

The regression equation is ListPrice = 18.8 + 6.27 SqFeet − 16.2 NumFlrs − 2.67 Bdrms + 30.3 Baths

| Predictor | Coef | SE Coef | T | P |
|---|---|---|---|---|
| Constant | 18.763 | 9.207 | 2.04 | 0.069 |
| SqFeet | 6.2698 | 0.7252 | 8.65 | 0.000 |
| NumFlrs | −16.203 | 6.212 | −2.61 | 0.026 |
| Bdrms | −2.673 | 4.494 | −0.59 | 0.565 |
| Baths | 30.271 | 6.849 | 4.42 | 0.001 |

The entries in the Coef column are the partial regression coefficients b₀, b₁, ..., b₄; the regression equation just above the table is built from them.
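The least-squares computation itself can be sketched in Python. The data below are simulated for illustration (they are not the textbook's property database), and `numpy.linalg.lstsq` stands in for Minitab's regression routine:

```python
import numpy as np

# Simulated illustrative data: y depends on two predictors plus random error.
# True coefficients (20, 6, -15) are hypothetical, chosen for this sketch.
rng = np.random.default_rng(0)
n = 50
x1 = rng.uniform(10, 30, n)                    # e.g. square feet (hundreds)
x2 = rng.integers(1, 4, n).astype(float)       # e.g. number of floors
y = 20 + 6 * x1 - 15 * x2 + rng.normal(0, 2, n)

# Design matrix with a leading column of ones for the intercept b0.
X = np.column_stack([np.ones(n), x1, x2])

# lstsq finds (b0, b1, b2) minimizing SSE = sum((y - X @ b)**2).
b, sse, *_ = np.linalg.lstsq(X, y, rcond=None)

print(b)    # estimates (b0, b1, b2), close to the true (20, 6, -15)
print(sse)  # residual sum of squares, the minimized SSE
```

With enough observations and modest noise, the estimates bᵢ land near the coefficients used to generate the data, mirroring how Minitab's Coef column estimates the βᵢ.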
