Overcoming Multicollinearity in Multiple Regression Using Correlation Coefficient

H. J. Zainodin and S. J. Yap
Mathematics with Economics Programme, School of Science and Technology, Universiti Malaysia Sabah, 88400 Kota Kinabalu, Sabah, Malaysia

Abstract. Multicollinearity occurs when there are high correlations among the independent variables. In this case, it is difficult to distinguish the contributions of these independent variables to the dependent variable, as they may compete to explain much of the same variance. Moreover, multicollinearity violates an assumption of multiple regression: that there is no collinearity among the candidate independent variables. Thus, an alternative approach to overcoming the multicollinearity problem is introduced so that a well-represented model can eventually be achieved. The approach removes the multicollinearity source variables on the basis of the correlation coefficient values in the full correlation matrix. Using the full correlation matrix facilitates the use of Excel functions in removing the multicollinearity source variables. This procedure is found to be easier and more time-saving, especially when dealing with a larger number of independent variables in a model and a large number of possible models. Hence, this paper presents, compares and implements the procedure in detail.

Keywords: Multicollinearity, Well-Represented Model, Multiple Regression, Model Building Procedures.
PACS: 01.40.gb, 01.40.gf

INTRODUCTION

According to Kutner et al. (2005), multicollinearity occurs when there are high correlations among independent variables. In this case, these independent variables may compete to explain much of the same variance, making it difficult to distinguish their individual contributions to the dependent variable. Thus, according to Zainodin et al.
(2011), a multicollinearity test is included in the model building procedures of multiple regression in order to remove multicollinearity source variables from each of the possible models. Although a number of methods can be used to attenuate the problem of multicollinearity, only the Zainodin-Noraini (Z-N) multicollinearity remedial method proposed by Zainodin et al. (2011) is discussed in detail here, because this paper proposes a modification of that method that is more efficient and time-saving.

MULTICOLLINEARITY

Gujarati and Porter (2009) state that the term multicollinearity was first used by Ragnar Frisch. Originally, multicollinearity meant the existence of an exact or perfect linear relationship among some or all of the independent variables of a regression model. However, the term is nowadays used in a broader sense that includes both perfect and less-than-perfect multicollinearity. If perfect multicollinearity occurs, the regression coefficients of the independent variables are indeterminate and their standard errors are infinite. If less than perfect multicollinearity occurs, the regression coefficients remain determinate, but they have large standard errors and so cannot be estimated with great precision.
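The indeterminacy under perfect multicollinearity can be seen numerically: when one regressor is an exact linear multiple of another, the X'X matrix in the least-squares normal equations is rank-deficient and cannot be inverted. A minimal NumPy sketch (variable names are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = rng.normal(size=50)
x2 = 2.0 * x1                        # exact linear dependence: perfect multicollinearity
X = np.column_stack([np.ones(50), x1, x2])

XtX = X.T @ X                        # the X'X matrix of the normal equations
rank = np.linalg.matrix_rank(XtX)
print(rank)                          # 2 < 3: X'X is singular
# Because X'X cannot be inverted, the OLS coefficients are indeterminate;
# the condition number of X'X blows up accordingly.
print(np.linalg.cond(XtX))
```

Under less-than-perfect (near) multicollinearity, X'X is technically invertible but ill-conditioned, which is what inflates the standard errors of the coefficients.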
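This preview does not reproduce the full Z-N removal rule, but the idea of removing multicollinearity source variables from the full correlation matrix can be sketched as follows. This is a minimal illustration assuming a simple pairwise-threshold rule; the function name, the 0.95 cutoff, and the "drop the later variable of the pair" choice are assumptions for demonstration, not the paper's exact procedure:

```python
import numpy as np

def remove_high_corr(X, names, threshold=0.95):
    """Illustrative rule: from each pair of variables whose absolute
    correlation exceeds the threshold, drop the later variable."""
    corr = np.corrcoef(X, rowvar=False)          # full correlation matrix
    keep = list(range(X.shape[1]))
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if i in keep and j in keep and abs(corr[i, j]) > threshold:
                keep.remove(j)                   # drop one multicollinearity source variable
    return X[:, keep], [names[k] for k in keep]

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = x1 + rng.normal(scale=0.01, size=100)       # nearly collinear with x1
x3 = rng.normal(size=100)                        # independent
X = np.column_stack([x1, x2, x3])

X_clean, kept = remove_high_corr(X, ["X1", "X2", "X3"])
print(kept)                                      # ['X1', 'X3']: X2 removed
```

Working from the one full correlation matrix, as the abstract describes, means every pairwise coefficient is computed once up front, which is what makes a spreadsheet (Excel) implementation straightforward when the number of candidate models is large.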