
The parameter $\beta_j$, $j = 0, 1, \ldots, k$, is also called a *partial regression coefficient* because it measures the expected change in the response variable $y$ per unit change in $x_j$ when all remaining independent variables $x_i$ ($i \neq j$) are held constant.

# 18.3.3.1 Estimating Parameters of the Multiple Regression Model

Multiple regression analysis requires that we obtain sample data and calculate values of the unknown parameters $\beta_0, \beta_1, \ldots, \beta_k$ by fitting the model to the data. To fit the general linear multiple regression model using the method of least squares, we choose the estimated regression equation

$$\hat{y} = b_0 + b_1 x_1 + b_2 x_2 + \cdots + b_k x_k$$

that minimizes the sum of squared errors, $\text{SSE} = \sum (y_i - \hat{y}_i)^2$, where $y_i$ and $\hat{y}_i$ represent the observed and the estimated (or predicted) values of the dependent variable for the $i$th observation. The terms $b_j$, $j = 0, 1, \ldots, k$, are the least-squares estimates of the population regression parameters $\beta_j$.

**Estimation: The Method of Least Squares**

The method of least squares, which computes the values of the regression coefficients (or parameters) $a$ and $b$ for estimating the dependent variable $y$ in a simple linear regression model, can also be extended to estimate the unknown parameters $\beta_0, \beta_1, \ldots, \beta_k$ from sample data. The least-squares estimators of these parameters are denoted by $b_0, b_1, \ldots, b_k$ respectively. Given these values, the least-squares multiple regression equation can be written as

$$\hat{y} = a + b_1 x_1 + b_2 x_2 + \cdots + b_k x_k$$

where

- $\hat{y}$ = estimated value of the dependent variable $y$
- $a$ = $y$-intercept
- $x_1, x_2, \ldots$ = independent variables
- $b_j$ = slope associated with variable $x_j$, $j = 1, 2, \ldots, k$

To visualize a multiple regression model, consider the following regression equation involving two independent variables $x_1$ and $x_2$ and a dependent variable $y$:

$$\hat{y} = a + b_1 x_1 + b_2 x_2$$
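The least-squares fit described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the text's worked example: the data values below are made up, and the intercept is obtained by prepending a column of ones to the design matrix.

```python
# Minimal sketch of least-squares estimation for the multiple
# regression model y-hat = b0 + b1*x1 + b2*x2 (hypothetical data).
import numpy as np

# Made-up observations: two independent variables and a response.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y = np.array([5.1, 6.9, 12.2, 12.8, 17.1])

# Design matrix with a leading column of ones for the intercept b0.
X = np.column_stack([np.ones_like(x1), x1, x2])

# lstsq returns the coefficients b = (b0, b1, b2) that minimize
# SSE = sum((y_i - yhat_i)**2).
b, *_ = np.linalg.lstsq(X, y, rcond=None)

y_hat = X @ b
sse = np.sum((y - y_hat) ** 2)
print("estimates b0, b1, b2:", b)
print("SSE:", sse)
```

A standard check on the fit is that the residuals are orthogonal to every column of the design matrix, which is exactly the condition the least-squares minimization imposes.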
As an illustrative example, suppose that the sales performance ($Y_i$) of a salesman depends on his aptitude test score ($X_{1i}$) and also on the 'index of effort' ($X_{2i}$) expended by him. Accordingly, for each salesman, data on sales performance, aptitude test score and index of effort are gathered for the multiple regression analysis. The sample data gathered are as given in the Table. In multiple regression, the linear relationship with two independent variables is given by

$$Y_i = a + b_1 X_{1i} + b_2 X_{2i} + e_i$$

For more than two independent variables, additional terms may be added. The terms $b_1$ and $b_2$ are called partial regression coefficients. The least-squares method minimizes the error sum of squares, that is, $\sum e_i^2 = \sum (Y_i - \hat{Y}_i)^2$ is minimized. By using calculus, the minimization reduces to the following three normal equations:

$$\sum Y_i = na + b_1 \sum X_{1i} + b_2 \sum X_{2i}$$

$$\sum X_{1i} Y_i = a \sum X_{1i} + b_1 \sum X_{1i}^2 + b_2 \sum X_{1i} X_{2i}$$

$$\sum X_{2i} Y_i = a \sum X_{2i} + b_1 \sum X_{1i} X_{2i} + b_2 \sum X_{2i}^2$$
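The three normal equations form a $3 \times 3$ linear system in $a$, $b_1$ and $b_2$, so they can be solved directly. The sketch below uses hypothetical salesman data (the text's actual table is not reproduced in this excerpt) to build and solve that system.

```python
# Sketch: solving the three normal equations for
# Y_i = a + b1*X1i + b2*X2i + e_i, using made-up salesman data:
#   sum(Y)    = n*a       + b1*sum(X1)    + b2*sum(X2)
#   sum(X1*Y) = a*sum(X1) + b1*sum(X1^2)  + b2*sum(X1*X2)
#   sum(X2*Y) = a*sum(X2) + b1*sum(X1*X2) + b2*sum(X2^2)
import numpy as np

# Hypothetical data: aptitude test score, index of effort, sales.
X1 = np.array([60.0, 70.0, 50.0, 80.0, 90.0, 65.0])
X2 = np.array([4.0, 5.0, 3.0, 7.0, 6.0, 4.5])
Y = np.array([20.0, 26.0, 15.0, 30.0, 36.0, 23.0])

n = len(Y)
# Coefficient matrix and right-hand side of the normal equations.
A = np.array([
    [n,        X1.sum(),         X2.sum()],
    [X1.sum(), (X1 ** 2).sum(),  (X1 * X2).sum()],
    [X2.sum(), (X1 * X2).sum(),  (X2 ** 2).sum()],
])
rhs = np.array([Y.sum(), (X1 * Y).sum(), (X2 * Y).sum()])

a, b1, b2 = np.linalg.solve(A, rhs)
print(f"a = {a:.4f}, b1 = {b1:.4f}, b2 = {b2:.4f}")
```

Solving the normal equations this way gives exactly the same coefficients as a generic least-squares routine applied to the design matrix $[1, X_1, X_2]$, since both minimize the same error sum of squares.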

