
3. $\text{Cov}(u_i, u_j) = 0$: the errors are statistically independent of one another.
4. $\text{Cov}(u_t, x_t) = 0$: no relationship between the error and the corresponding $x$ variate.

The Assumptions Underlying the CLRM Again

An alternative assumption to 4., which is slightly stronger, is that the $x_t$'s are non-stochastic or fixed in repeated samples. A fifth assumption is required if we want to make inferences about the population parameters (the actual $\alpha$ and $\beta$) from the sample parameters ($\hat{\alpha}$ and $\hat{\beta}$).

Additional assumption 5. $u_t$ is normally distributed.

Properties of the OLS Estimator

If assumptions 1. through 4. hold, then the estimators $\hat{\alpha}$ and $\hat{\beta}$ determined by OLS are known as Best Linear Unbiased Estimators (BLUE). What does the acronym stand for?

"Estimator": $\hat{\beta}$ is an estimator of the true value of $\beta$.
"Linear": $\hat{\beta}$ is a linear estimator.
"Unbiased": on average, the actual values of $\hat{\alpha}$ and $\hat{\beta}$ will be equal to the true values.
"Best": the OLS estimator $\hat{\beta}$ has minimum variance among the class of linear unbiased estimators. The Gauss-Markov theorem proves that the OLS estimator is best.
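As a worked illustration of calculating the OLS estimates, a minimal sketch with hypothetical data standing in for a table of observations (the numbers below are made up): the standard formulas $\hat{\beta} = \sum_t (x_t - \bar{x})(y_t - \bar{y}) / \sum_t (x_t - \bar{x})^2$ and $\hat{\alpha} = \bar{y} - \hat{\beta}\bar{x}$ can be applied directly.

```python
import numpy as np

# Hypothetical observations standing in for a table of (x, y) data.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

x_bar, y_bar = x.mean(), y.mean()

# Slope: beta_hat = sum((x_t - x_bar)(y_t - y_bar)) / sum((x_t - x_bar)^2)
beta_hat = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)

# Intercept: alpha_hat = y_bar - beta_hat * x_bar
alpha_hat = y_bar - beta_hat * x_bar

print(f"alpha_hat = {alpha_hat:.3f}, beta_hat = {beta_hat:.3f}")
```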
5. Multiple linear regression model

Before, we have used the model

$$y_t = \alpha + \beta x_t + u_t, \qquad t = 1, 2, \ldots, T$$

This bivariate model suggests that changes in the dependent variable are explained by reference to changes in one explanatory variable. But what if our dependent variable $y$ depends on more than one independent variable? For example, the number of cars sold might plausibly depend on (i) the price of cars, (ii) the price of public transport, (iii) the price of petrol, and (iv) the extent of the public's concern about global warming. Having just one independent variable is no good in this case: we want to have more than one $x$ variable. It is more prudent to examine the effect of all of the explanatory variables together on the explained variable in one equation than to use each of the four proposed explanatory factors in separate regressions.

It is very easy to generalise the simple model to one with $k$ regressors (independent variables), giving the following equation:

$$y_t = \beta_1 + \beta_2 x_{2t} + \beta_3 x_{3t} + \cdots + \beta_k x_{kt} + u_t, \qquad t = 1, 2, \ldots, T$$

The variables $x_{2t}, x_{3t}, \ldots, x_{kt}$ are a set of $k-1$ explanatory variables which are thought to influence $y$, and the coefficients $\beta_1, \beta_2, \ldots, \beta_k$ are the parameters which quantify the effect of each of these explanatory variables on $y$.

The coefficient interpretations are slightly altered in the multiple regression context. Each coefficient is now known as a partial regression coefficient, interpreted as representing the partial effect of the given explanatory variable on the explained variable, after holding constant, or eliminating the effect of, all other explanatory variables. For example, $\hat{\beta}_2$ measures the effect of $x_2$ on $y$ after eliminating the effects of $x_3, x_4, \ldots, x_k$. In other words, each coefficient measures the average change in the dependent variable per unit change in a given independent variable, holding all other independent variables constant.
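As a minimal sketch of estimating such a multiple regression (the data below are simulated and every value is hypothetical), the OLS coefficient vector can be computed from the standard matrix formula $\hat{\beta} = (X'X)^{-1}X'y$:

```python
import numpy as np

# Simulated data for the car-sales illustration (all values hypothetical):
# y = cars sold, x2 = car price, x3 = public transport price, x4 = petrol price.
rng = np.random.default_rng(0)
T = 200
x2 = rng.uniform(10, 20, T)
x3 = rng.uniform(1, 5, T)
x4 = rng.uniform(1, 3, T)
u = rng.normal(0, 1, T)                          # disturbance term
y = 50 - 2.0 * x2 + 1.5 * x3 - 3.0 * x4 + u

# Design matrix X: a column of ones (for the intercept beta_1) plus the regressors.
X = np.column_stack([np.ones(T), x2, x3, x4])

# OLS: beta_hat = (X'X)^{-1} X'y, computed via a linear solve for stability.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)  # should be close to the true values [50, -2.0, 1.5, -3.0]
```

Each estimated slope here is a partial regression coefficient: the estimate on x2, for instance, reflects the change in y per unit change in x2 with x3 and x4 held constant.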
