STA 3024 Introduction to Statistics 2

Chapter 6: Multiple Linear Regression Analysis

As stated in Chapters 3 and 4, the table below summarizes the major material that we need to cover.

Table 1: Methods to Investigate the Association between Variables

                     Explanatory Variable(s)   Response Variable   Method
Chapter 3            Categorical               Categorical         Contingency Tables
Chapter 4            Categorical               Quantitative        Analysis of Variance (ANOVA)
Chapters 5 and 6     Quantitative              Quantitative        Regression Analysis
                     Quantitative              Categorical         (not discussed)

This chapter deals with cases where both the explanatory and response variables are quantitative; we'll use regression analysis to study the association between the two variables. The regression methods that we're studying in this chapter are restricted to the linear regression family (as opposed to nonlinear regression analysis).

If there's only one quantitative explanatory variable, then we'll study simple linear regression. If there is more than one explanatory variable, then we'll introduce multiple linear regression. This chapter corresponds to Chapter 13 in our textbook.

1 PART I - BACKGROUND

1.1 Background and Remarks

Sometimes we want to take into account more than one factor to explain/predict an outcome. For example, there is no single factor that determines NBA salaries, but many of them: average points per game, average rebounds, steals, etc. That's why simple linear regression needs to be generalized to multiple linear regression.

The multiple linear regression model relates the mean μ_Y of a quantitative response variable Y to a set of p independent explanatory variables X_1, X_2, ..., X_p. The multiple linear regression equation for the population is

    μ_Y = α + β_1 X_1 + β_2 X_2 + ... + β_p X_p.

The sample prediction equation is

    Ŷ = a + b_1 X_1 + b_2 X_2 + ... + b_p X_p.

Here, α is a population parameter; it is the Y-intercept, or the value of Y when all of the explanatory variables are 0.
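The two equations above can be illustrated with a short computation. The sketch below fits the sample prediction equation by least squares on a small hypothetical data set (the player statistics and salaries are made-up illustrative numbers, not real NBA data), using NumPy's least-squares solver:

```python
import numpy as np

# Hypothetical data: salary Y (in millions) for six players, with
# X1 = average points per game and X2 = average rebounds per game.
# These values are invented purely for illustration.
X1 = np.array([25.0, 18.5, 30.1, 12.4, 22.3, 8.7])
X2 = np.array([7.2, 10.1, 5.5, 3.8, 6.0, 2.9])
Y  = np.array([30.5, 24.0, 35.2, 10.1, 26.8, 4.3])

# Design matrix: a leading column of 1s produces the intercept a.
X = np.column_stack([np.ones_like(X1), X1, X2])

# Least squares yields the sample estimates a, b1, b2 of the
# population parameters alpha, beta_1, beta_2.
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)
a, b1, b2 = coef
print(f"Yhat = {a:.2f} + {b1:.2f} X1 + {b2:.2f} X2")
```

The fitted values Ŷ = a + b_1 X_1 + b_2 X_2 are the estimates of μ_Y at each observed combination of explanatory variables.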
Corresponding to the population parameter α is the sample statistic a, which is the estimate of α based on the sample data. Furthermore, β_1, β_2, ..., β_p are the coefficients of the independent variables X_1, X_2, ..., X_p (they are population parameters). Corresponding to β_1, β_2, ..., β_p are the sample statistics b_1, b_2, ..., b_p, which are the estimates of β_1, β_2, ..., β_p based on the sample data.

For any particular combination of values of X_1, X_2, ..., X_p, the value of Ŷ is the estimate for μ_Y. Note that we are dealing with linear regression. That is, the relationship between Y and each of the explanatory variables X_1, X_2, ..., X_p is linear.

Example: Linear or Nonlinear?

    μ_Y = α + β_1 X_1 + β_2 X_2 + ... + β_p X_p ...
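To make the prediction step concrete, here is a minimal sketch of evaluating Ŷ for one particular combination of values of X_1 and X_2. The coefficient values are hypothetical, chosen only to show the arithmetic, not taken from any fitted model:

```python
# Hypothetical sample estimates (illustrative numbers only):
# a = 2.0 (intercept), b1 = 0.9 (per point per game), b2 = 1.1 (per rebound).
a, b1, b2 = 2.0, 0.9, 1.1

# Predict Yhat for a player with X1 = 20 points and X2 = 8 rebounds per game.
x1, x2 = 20.0, 8.0
y_hat = a + b1 * x1 + b2 * x2
print(y_hat)  # 2.0 + 0.9*20 + 1.1*8 = 28.8
```

Plugging in a different (X_1, X_2) combination gives a different Ŷ, but the equation itself stays linear in each explanatory variable.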