…_j) + U_j = (α + γ·η) + (β + γ·δ)·X_j + U_j,   j = 1, 2, …, n,   (37)
Clearly, only α + γ·η and β + γ·δ can be estimated by OLS, but not α, β and γ separately without knowing η and δ. This case is ruled out by Assumption 1.
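This identification failure is easy to verify numerically. The following sketch (not part of EasyReg; an illustration in Python with made-up parameter values) constructs a second regressor that is an exact linear function of the first, shows that the full design matrix is rank-deficient, and shows that regressing on the first regressor alone recovers only the combined coefficients α + γ·η and β + γ·δ:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
alpha, beta, gamma = 1.0, 2.0, 0.5   # true parameters (illustrative values)
eta, delta = 3.0, -1.5               # X2 is an exact linear function of X1

X1 = rng.normal(size=n)
X2 = eta + delta * X1                # perfect collinearity: X2 = eta + delta*X1
U = rng.normal(size=n)
Y = alpha + beta * X1 + gamma * X2 + U

# Regressing on (1, X1, X2): X'X is singular, so alpha, beta and gamma
# cannot be estimated separately.
X_full = np.column_stack([np.ones(n), X1, X2])
print(np.linalg.matrix_rank(X_full.T @ X_full))   # prints 2 (rank-deficient: 2 < 3)

# Regressing on (1, X1) alone recovers the combined coefficients.
X_red = np.column_stack([np.ones(n), X1])
b = np.linalg.lstsq(X_red, Y, rcond=None)[0]
print(b)   # approx [alpha + gamma*eta, beta + gamma*delta] = [2.5, 1.25]
```

Any software that inverts X'X directly will fail on the full regression, which is why this case must be excluded by assumption.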
6. Heteroscedasticity
Recall that the errors U_j of a regression model are heteroskedastic if the conditional variance of U_j given the explanatory variables is not constant, but a function of the explanatory variables. In particular, the error terms in model (7) are heteroskedastic if there exists a non-constant function ψ such that
E[U_j² | X_{1,j}, X_{2,j}, …, X_{k−1,j}] = ψ(X_{1,j}, X_{2,j}, …, X_{k−1,j}).   (38)
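Condition (38) is easy to visualize by simulation. The following sketch (an illustration in Python, with ψ chosen arbitrarily as ψ(X) = X²) generates errors whose conditional variance grows with X, and confirms that the sample variance of U differs across subsamples with small and large X:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
X = rng.uniform(1.0, 3.0, size=n)

# Heteroskedastic errors: E[U^2 | X] = psi(X) = X^2 (an illustrative choice)
U = X * rng.normal(size=n)

# Empirical conditional variance of U for small vs. large X:
print(U[X < 1.5].var())   # roughly the average of X^2 over [1, 1.5)
print(U[X > 2.5].var())   # roughly the average of X^2 over (2.5, 3]
```

Under homoskedasticity the two printed variances would agree; here the second is several times larger than the first.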
Heteroscedasticity often occurs in practice; it is actually the rule rather than the exception. One of the problems with heteroscedasticity is that the standard errors and t-values of the OLS parameter estimators are no longer valid. However, this problem can easily be cured by replacing the standard errors (24) with the heteroscedasticity-consistent (H.C.) standard errors:
σ̃_i = √( Σ_{j=1}^n w_{i,j}² Û_j² )   (= H.C. standard error of β̂_i),   (39)
and the t-values with the heteroscedasticity-consistent (H.C.) t-values
t̃_i = β̂_i / σ̃_i   (= H.C. t-value of β̂_i).   (40)
The F and Wald tests in Proposition 4 are also no longer valid under heteroscedasticity,
but the cure for this is difficult to explain at the undergraduate level. To test joint hypotheses
under heteroscedasticity with EasyReg you have to increase the econometrics level to
“Intermediate”. Then after running your regression you will get the option to conduct Wald tests
of linear parameter restrictions. This option gives you two versions of the Wald test, one for the
homoscedastic case and one for the heteroskedastic case. See the guided tour on OLS estimation.
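For readers curious what the heteroskedastic version of the Wald test computes, the following is a sketch of the standard robust Wald statistic for linear restrictions H₀: Rβ = q, namely (Rβ̂ − q)'[R·V̂·R']⁻¹(Rβ̂ − q) with V̂ the H.C. (sandwich) covariance estimate, which is asymptotically χ² with m degrees of freedom under H₀ (m the number of restrictions). This is the textbook construction and is not necessarily the exact implementation in EasyReg:

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(3)
n = 1000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
X = np.column_stack([np.ones(n), x1, x2])
U = (1 + np.abs(x1)) * rng.normal(size=n)     # heteroskedastic errors
y = X @ np.array([1.0, 0.0, 0.0]) + U         # true beta1 = beta2 = 0

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
U_hat = y - X @ beta_hat

# H.C. (sandwich) estimate of Var(beta_hat): (X'X)^-1 X' diag(U_hat^2) X (X'X)^-1
XtX_inv = np.linalg.inv(X.T @ X)
V = XtX_inv @ (X.T * U_hat**2) @ X @ XtX_inv

# Robust Wald test of H0: R beta = q (here the joint restriction beta1 = beta2 = 0)
R = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
q = np.zeros(2)
d = R @ beta_hat - q
wald = d @ np.linalg.inv(R @ V @ R.T) @ d
p_value = chi2.sf(wald, df=R.shape[0])        # p-value is approx. uniform under H0
print(wald, p_value)
```

The homoskedastic version of the test is obtained by replacing V with σ̂²(X'X)⁻¹; under heteroscedasticity only the sandwich version gives asymptotically valid inference.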