Previously, we used the bivariate model

y_t = α + β x_t + u_t,   t = 1, 2, …, T

This model suggests that changes in the dependent variable are explained by reference to changes in a single explanatory variable. But what if our dependent variable, y, depends on more than one independent variable? For example, the number of cars sold might plausibly depend on (i) the price of cars, (ii) the price of public transport, (iii) the price of petrol, and (iv) the extent of the public's concern about global warming. Having just one independent variable is no good in this case: we want more than one x variable. It is more prudent to examine the effect of all of the explanatory variables together on the explained variable in one equation than to use each of the four proposed explanatory factors in separate regressions.

It is very easy to generalise the simple model to one with k regressors (independent variables), giving the following equation:

y_t = β_1 + β_2 x_2t + β_3 x_3t + … + β_k x_kt + u_t,   t = 1, 2, …, T

The variables x_2t, x_3t, …, x_kt are a set of k − 1 explanatory variables which are thought to influence y, and the coefficients β_1, β_2, …, β_k are the parameters which quantify the effect of each of these explanatory variables on y.

The coefficient interpretations are slightly altered in the multiple regression context. Each coefficient is now known as a partial regression coefficient, interpreted as representing the partial effect of the given explanatory variable on the explained variable, after holding constant, or eliminating the effect of, all other explanatory variables. For example, β̂_2 measures the effect of x_2 on y after eliminating the effects of x_3, x_4, …, x_k. In other words, each coefficient measures the average change in the dependent variable per unit change in a given independent variable, holding all other independent variables constant.
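The "partial" interpretation can be seen numerically. The sketch below, using simulated data (all variable names and true parameter values are illustrative assumptions, not from the text), estimates a three-parameter multiple regression by ordinary least squares, then confirms that the coefficient on x_2 equals the slope obtained after first purging x_2 of the influence of the other regressors (the Frisch–Waugh result):

```python
import numpy as np

# Illustrative sketch: OLS estimation of y_t = b1 + b2*x2_t + b3*x3_t + u_t.
# The data are simulated; true parameters (1, 2, -3) are assumptions chosen
# purely so we can check the estimates against known values.
rng = np.random.default_rng(0)
T = 500
x2 = rng.normal(size=T)
x3 = 0.5 * x2 + rng.normal(size=T)      # x3 deliberately correlated with x2
u = rng.normal(scale=0.1, size=T)
y = 1.0 + 2.0 * x2 - 3.0 * x3 + u

# Regressor matrix: a column of ones for the intercept, then x2 and x3.
X = np.column_stack([np.ones(T), x2, x3])
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# "Partial effect" of x2: residualise x2 on the other regressors, then
# regress y on those residuals. The slope matches beta_hat[1] exactly.
Z = np.column_stack([np.ones(T), x3])
gamma, *_ = np.linalg.lstsq(Z, x2, rcond=None)
x2_resid = x2 - Z @ gamma
b2_partial = (x2_resid @ y) / (x2_resid @ x2_resid)

print(beta_hat)      # close to the true values 1, 2, -3
print(b2_partial)    # equals beta_hat[1]
```

The agreement between b2_partial and beta_hat[1] is what "eliminating the effects of x_3, …, x_k" means in practice: the multiple-regression coefficient on x_2 only uses the variation in x_2 that is not shared with the other explanatory variables.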