Answers to selected problems in chapters 10, 11, 12 and 13
CH 10: MULTICOLLINEARITY: WHAT HAPPENS IF THE REGRESSORS
ARE CORRELATED?
10.1
If X_k is a perfect linear combination of the remaining explanatory
variables, then there are only (k − 1) equations in k unknowns. With
more unknowns than equations, unique solutions are not possible.
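The rank deficiency described in 10.1 can be sketched numerically (simulated data; the variable names and coefficients are purely illustrative): when one regressor is an exact linear combination of the others, the cross-product matrix X'X is singular, so the normal equations have no unique solution.

```python
import numpy as np

# Illustrative simulated data: x3 is an exact linear combination of
# x1 and x2, so X'X loses one rank and cannot be inverted.
rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 2.0 * x1 - 3.0 * x2          # perfect collinearity (assumed weights)
X = np.column_stack([np.ones(n), x1, x2, x3])

XtX = X.T @ X
print(np.linalg.matrix_rank(XtX))  # prints 3, not 4: one column is redundant
```

Because the rank is 3 while there are 4 coefficients to solve for, OLS cannot return a unique coefficient vector.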
10.2
(
a
) No. Variable
X
3i
is an exact linear combination of
X
2i
, because
X
3
i
=
2
X
2
i

1.
(b) Substituting X_{3i} = 2X_{2i} − 1 into the model and rewriting
yields

  Y_i = β_1 + β_2 X_{2i} + β_3 (2X_{2i} − 1) + u_i
      = (β_1 − β_3) + (β_2 + 2β_3) X_{2i} + u_i
      = α_1 + α_2 X_{2i} + u_i,

where α_1 = β_1 − β_3 and α_2 = β_2 + 2β_3. Therefore, we can estimate
α_1 and α_2 uniquely, but not the original betas, because we have only
two equations in three unknowns.
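A quick numerical sketch of 10.2(b) (simulated data; the "true" betas below are assumed values chosen for illustration): OLS on X_{2i} alone recovers the α's, which match the stated combinations of the betas, while the individual betas stay unidentified.

```python
import numpy as np

# Simulated data with assumed true betas.  Since x3 = 2*x2 - 1, the model
# collapses to Y = a1 + a2*x2 + u with a1 = b1 - b3 and a2 = b2 + 2*b3.
rng = np.random.default_rng(1)
n = 200
x2 = rng.normal(size=n)
x3 = 2.0 * x2 - 1.0
b1, b2, b3 = 1.0, 0.5, 2.0                     # hypothetical true betas
y = b1 + b2 * x2 + b3 * x3 + rng.normal(scale=0.1, size=n)

A = np.column_stack([np.ones(n), x2])
a_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
print(a_hat)  # close to (b1 - b3, b2 + 2*b3) = (-1.0, 4.5)
```

The fit is essentially perfect, yet nothing in the data lets us split α_2 = 4.5 back into separate values of β_2 and β_3.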
10.3
(a) Although the numerical values of the intercept and the slope
coefficients of PGNP and FLR have changed, their signs have not. Also,
these variables are still statistically significant. These changes are
due to the addition of the TFR variable, suggesting that there may be
some collinearity among the regressors.
(b) Since the t value of the TFR coefficient is highly significant (the
p value is only 0.0032), it seems TFR belongs in the model. The positive
sign of this coefficient also makes sense: the larger the number of
children born to a woman, the greater the chance of increased child
mortality.
(c) This is one of those “happy” occurrences where, despite possible
collinearity, the individual coefficients are still statistically
significant.
10.5
(a) Yes. Economic time series data tend to move in the same direction.
Here, the lagged values of income will generally move in the same
direction as current income.
(b) As discussed briefly in Chapter 10 and further discussed in
Chapter 17, the first difference transformation may alleviate the
problem.
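The effect of first differencing can be sketched with simulated data (an assumed random-walk "income" series, not actual data): the level of the series is highly correlated with its own lag, while the first differences are, by construction here, close to serially uncorrelated.

```python
import numpy as np

# Simulated "income" as a random walk: levels are strongly correlated
# with their own lag; first differences are white noise by construction.
rng = np.random.default_rng(2)
income = np.cumsum(rng.normal(size=300))        # trending series

r_levels = np.corrcoef(income[1:], income[:-1])[0, 1]

d = np.diff(income)                              # first differences
r_diff = np.corrcoef(d[1:], d[:-1])[0, 1]

# r_levels is typically near 1; r_diff is typically near 0.
print(round(r_levels, 3), round(r_diff, 3))
```

This is why differencing is a candidate remedy when collinearity comes from common trends, though (as Chapter 17 discusses) it has costs of its own.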
10.10
(a) No. Multicollinearity refers to linear association among variables.
Here the association is nonlinear.
(b) There is no reason to drop them. They are theoretically as well as
statistically significant in the present example.
(c) If one of the variables is dropped, there will be specification bias
that will show up in the coefficient(s) of the remaining variable(s).
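The point of 10.10(a) can be illustrated with simulated data (the variable and range are assumed for illustration): a regressor and its square can have a high sample correlation, yet because the dependence is nonlinear rather than exactly linear, X'X stays full rank and OLS is still estimable.

```python
import numpy as np

# Simulated regressor and its square: highly correlated in sample,
# but not an exact *linear* combination, so X'X remains invertible.
rng = np.random.default_rng(3)
x = rng.uniform(1.0, 10.0, size=100)
X = np.column_stack([np.ones_like(x), x, x**2])

r = np.corrcoef(x, x**2)[0, 1]                  # high, but below 1
rank = np.linalg.matrix_rank(X.T @ X)           # full rank: 3
print(round(r, 3), rank)
```

High near-collinearity of this numerical kind can still inflate standard errors, but it does not violate the no-perfect-collinearity assumption.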
10.12
(a) False. If exact linear relationship(s) exist among the variables, we
cannot even estimate the coefficients or their standard errors.
(b) False. One may still be able to obtain one or more significant t
values.
(c) False. As noted in the chapter (see Eq. 7.5.6), the variance of an
OLS estimator is given by

  var(β̂_2) = σ² / [ Σx_{2i}² (1 − r_{23}²) ],

where the x_{2i} are deviations from the sample mean. As can be seen
from this formula, a high r_{23}² can be counterbalanced by a low σ² or
a high Σx_{2i}².
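The counterbalancing argument is easy to check arithmetically (the numbers below are purely illustrative, not from the text):

```python
# Numeric check of var(b2) = sigma^2 / (sum_x2_sq * (1 - r23_sq)):
# collinearity inflates the variance, but a larger sum of squared
# deviations (more data or more spread in X2) can offset it exactly.
def var_b2(sigma2, sum_x2_sq, r23_sq):
    return sigma2 / (sum_x2_sq * (1.0 - r23_sq))

base   = var_b2(sigma2=4.0, sum_x2_sq=100.0,  r23_sq=0.0)  # no collinearity
high_r = var_b2(sigma2=4.0, sum_x2_sq=100.0,  r23_sq=0.9)  # 10x inflation
offset = var_b2(sigma2=4.0, sum_x2_sq=1000.0, r23_sq=0.9)  # offset by spread
print(base, high_r, offset)  # 0.04, then 0.4, then back to 0.04
```

So a high r_{23}² does not by itself guarantee large standard errors.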
(d) Uncertain. If a model has only two regressors, high pairwise
correlation coefficients may suggest multicollinearity. If one or more
regressors enter nonlinearly, however, the pairwise correlations may
give misleading answers.
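Pairwise correlations can also mislead with more than two regressors. A simulated sketch (all data and coefficients assumed for illustration): when a regressor is nearly the sum of three mutually independent regressors, every pairwise correlation is only moderate, yet the auxiliary regression reveals near-perfect multicollinearity.

```python
import numpy as np

# Simulated data: x4 is almost exactly x1 + x2 + x3.  Each pairwise
# correlation with x4 is only about 0.58, but regressing x4 on the
# other three gives an R^2 near 1, i.e. severe multicollinearity.
rng = np.random.default_rng(4)
n = 500
x1, x2, x3 = rng.normal(size=(3, n))
x4 = x1 + x2 + x3 + rng.normal(scale=0.05, size=n)

r14 = np.corrcoef(x1, x4)[0, 1]                 # only moderate

A = np.column_stack([np.ones(n), x1, x2, x3])
coef, *_ = np.linalg.lstsq(A, x4, rcond=None)
resid = x4 - A @ coef
r_squared = 1.0 - resid @ resid / ((x4 - x4.mean()) @ (x4 - x4.mean()))
print(round(r14, 2), round(r_squared, 4))
```

This is the standard motivation for auxiliary-regression or VIF diagnostics rather than relying on the correlation matrix alone.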