3.9 Categorical explanatory variables

In this section, we indicate how to code categorical explanatory variables into binary dummy variables for use in multiple regression. A categorical variable with $m$ categories ($m \ge 2$) is converted into $m-1$ binary variables: one category is chosen as a baseline category, and the $m-1$ binary variables represent differences of the other categories relative to the baseline. The choice of baseline category is not unique.

To show the main ideas, first consider a multiple regression with one continuous and one categorical explanatory variable. If the categorical variable has two categories (e.g., female and male), define
\[
z_i = \begin{cases} 1 & \text{if category 2 for the $i$th case}, \\ 0 & \text{if category 1 for the $i$th case}. \end{cases} \tag{3.109}
\]
Here category 1 is taken as the baseline category. The data are converted to $(y_i, x_i, z_i)$, $i = 1, \ldots, n$. The regression equation becomes $Y_i = \mu_Y(x_i, z_i) + \epsilon_i$, where
\[
\mu_Y(x_i, z_i) = \beta_0 + \beta_1 x_i + \beta_2 z_i
= \begin{cases} \beta_0 + \beta_1 x_i & \text{if category 1}, \\ (\beta_0 + \beta_2) + \beta_1 x_i & \text{if category 2}. \end{cases} \tag{3.110}
\]
This implies a model in which the relation of $y$ with $x$ is linear for both categories with a common slope. On a scatterplot, the data for the two categories should therefore lie roughly on parallel lines; $\beta_2$ is interpreted as the vertical separation of the two lines.

If the scatterplot shows linear relationships with different slopes for the two categories, then for the multiple regression use the converted data $(y_i, x_i, z_i, x_i z_i)$, $i = 1, \ldots, n$. The regression equation becomes $Y_i = \mu^*_Y(x_i, z_i) + \epsilon_i$, where
\begin{align}
\mu^*_Y(x_i, z_i) &= \beta_0 + \beta_1 x_i + \beta_2 z_i + \beta_3 x_i z_i \tag{3.111} \\
&= \begin{cases} \beta_0 + \beta_1 x_i & \text{if category 1}, \\ (\beta_0 + \beta_2) + (\beta_1 + \beta_3) x_i & \text{if category 2}. \end{cases} \tag{3.112}
\end{align}
Hence $\beta_3$ is interpreted as the difference in slope for category 2 versus category 1. Is there a simple interpretation for $\beta_2$ in this case?
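As an illustrative sketch (the data below are synthetic, not from the text), the dummy coding in (3.109) and the common-slope fit (3.110) can be carried out with NumPy least squares:

```python
import numpy as np

# Synthetic data for the common-slope model (3.110):
# mu_Y(x, z) = beta0 + beta1*x + beta2*z, with z = 1 for category 2, else 0.
# The true coefficients (1, 2, 3) below are made up for illustration.
rng = np.random.default_rng(0)
n = 100
x = rng.uniform(0, 10, size=n)
z = rng.integers(0, 2, size=n)          # 0 = category 1 (baseline), 1 = category 2
y = 1.0 + 2.0 * x + 3.0 * z + rng.normal(scale=0.5, size=n)

# Design matrix: intercept column, continuous predictor, dummy; then least squares.
X = np.column_stack([np.ones(n), x, z])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)  # estimates should be close to the true (1, 2, 3)
```

The fitted $\hat\beta_2$ is the estimated vertical gap between the two parallel lines.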
The product $x_i z_i$ is an example of what is called an interaction term in multiple regression. An interaction term, formed as the product of two explanatory variables, indicates that the two variables do not influence the mean value of the response in an additive manner.
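The interaction model (3.111) can be fitted the same way by adding the product column to the design matrix. The following sketch uses synthetic data (the coefficient values are hypothetical), and recovers $\beta_3$ as the difference in slopes:

```python
import numpy as np

# Synthetic data for the interaction model (3.111):
# true slopes are 2 for category 1 and 2 + 1.5 = 3.5 for category 2.
rng = np.random.default_rng(1)
n = 200
x = rng.uniform(0, 10, size=n)
z = rng.integers(0, 2, size=n)          # 0 = category 1 (baseline), 1 = category 2
y = 1.0 + 2.0 * x + 0.5 * z + 1.5 * x * z + rng.normal(scale=0.5, size=n)

# Design matrix now includes the product column x*z.
X = np.column_stack([np.ones(n), x, z, x * z])
b0, b1, b2, b3 = np.linalg.lstsq(X, y, rcond=None)[0]
# b3 estimates the slope difference (category 2 minus category 1);
# b2 estimates the difference in intercepts, i.e. the gap between
# the two lines at x = 0.
```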
With two predictors, of which one is continuous and the other is binary, the regression lines can be shown in a plot; see, for example, Figure 3.6.

For a categorical variable with $m$ categories, create $m-1$ binary dummy variables $z_{i2}, \ldots, z_{im}$, where
\[
z_{ij} = \begin{cases} 1 & \text{if category $j$ for the $i$th case}, \\ 0 & \text{otherwise}, \end{cases} \qquad j = 2, \ldots, m. \tag{3.113}
\]
Here category 1 is taken as the baseline category. If the $i$th observation is in category 1, then $(z_{i2}, \ldots, z_{im}) = (0, \ldots, 0)$; if it is in category 2, then $(z_{i2}, \ldots, z_{im}) = (1, 0, \ldots, 0)$; if it is in category 3, then $(z_{i2}, \ldots, z_{im}) = (0, 1, 0, \ldots, 0)$; and so on, up to category $m$, for which $(z_{i2}, \ldots, z_{im}) = (0, \ldots, 0, 1)$. The regression equation becomes $Y_i = \mu_Y(x_i, z_{i2}, \ldots, z_{im}) + \epsilon_i$, where
\begin{align}
\mu_Y(x_i, z_{i2}, \ldots, z_{im}) &= \beta_0 + \beta_1 x_i + \beta_2 z_{i2} + \cdots + \beta_m z_{im} \tag{3.114} \\
&= \begin{cases} \beta_0 + \beta_1 x_i & \text{if category 1}, \\ (\beta_0 + \beta_2) + \beta_1 x_i & \text{if category 2}, \\ (\beta_0 + \beta_3) + \beta_1 x_i & \text{if category 3}, \end{cases}
\end{align}
and so on through category $m$.
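A minimal sketch of the $m$-category coding in (3.113), using made-up category labels for $m = 3$ and category 1 as the baseline:

```python
import numpy as np

# Hypothetical category label for each of six cases (m = 3 categories).
categories = np.array([1, 2, 3, 1, 3, 2])
m = 3

# Column j-2 of Z is the dummy z_ij for category j, j = 2, ..., m.
Z = np.column_stack([(categories == j).astype(float) for j in range(2, m + 1)])
print(Z)
# A category-1 case codes to (0, 0), category 2 to (1, 0), category 3 to (0, 1).
```

Columns of ones for the baseline category are never included: with an intercept already in the design matrix, $m$ dummies would be linearly dependent.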
