ECON 103, Lecture 10: Multiple Regression and Testing
Maria Casanova, May 3 (version 1)

1. Introduction

General Multiple Regression Model:

    Yi = β0 + β1 X1i + β2 X2i + ... + βk Xki + ui
We have estimated the parameters β0, β1, ..., βk. Now we will study:
- whether the estimated coefficients are statistically significant;
- whether certain relationships between parameters hold;
- whether a group of parameters is jointly significant.

Outline:
- Hypothesis tests for a single coefficient (for example H0: β2 = 0 or H0: β3 = 1)
- Hypothesis tests regarding multiple coefficients (for example H0: β1 = β2 = β4 = 0 or H0: β5 = 2β3)
- Example: the return to education

2. Hypothesis Tests for a Single Coefficient
Testing a hypothesis about a single coefficient is done in the same way as in the case of simple regression.

The CLT tells us that

    (β̂j − βj) / SE(β̂j) ∼ N(0, 1)

In practice we have to estimate SE(β̂j), which in turn depends on σ̂u², the estimated variance of the error term. We have n − k − 1 degrees of freedom to estimate this variance, thus an unbiased estimator of σu² is

    σ̂u² = [1 / (n − k − 1)] Σi ûi²
As a consequence of this, for small n the distribution of the standardized β̂j is a t with n − k − 1 degrees of freedom:

    (β̂j − βj) / SE(β̂j) ∼ t(n−k−1)

The t distribution converges to a normal when n is large:

    (β̂j − βj) / SE(β̂j) ∼ N(0, 1)
Steps for hypothesis testing:
1. Formulate the null hypothesis (e.g. H0: βj = 0).
2. Formulate the alternative hypothesis, either two-sided (H1: βj ≠ 0) or one-sided (H1: βj < 0 or H1: βj > 0).
3. Specify the level of significance α (e.g. α = 0.05).
4. Calculate the actual value of the t statistic under the null.
5. Compute the critical value according to the significance level α.
6. Decide whether you can or cannot reject the null hypothesis.
Test scores example:

    TestScoresi = β0 + β1 STRi + β2 PctLEi + ui
The estimated regression line is:

    TestScoresi = 686.0 − 1.10 STRi − 0.65 PctLEi
                  (8.73)   (0.43)     (0.03)

We are interested in whether the STR affects test scores.
1. H0: β1 = 0
2. H1: β1 ≠ 0
3. α = 5%
4. The actual value of the t statistic is

       t_act = (β̂1 − β1,0) / SE(β̂1) = (−1.10 − 0) / 0.43 = −2.56

5. The critical values are z(α/2) = −1.96 and z(1−α/2) = 1.96.
6. We reject the null hypothesis. β1 is negative and statistically significant: the STR has a negative and statistically significant effect on test scores.
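The arithmetic in step 4 can be checked directly; a quick Python sketch using the reported estimate and standard error (large-sample normal critical values, as on the slide):

```python
from statistics import NormalDist

beta_hat, se, beta_null, alpha = -1.10, 0.43, 0.0, 0.05

t_act = (beta_hat - beta_null) / se
crit = NormalDist().inv_cdf(1 - alpha / 2)

print(round(t_act, 2))    # -2.56, as on the slide
print(round(crit, 2))     # 1.96
print(abs(t_act) > crit)  # True: reject H0 at the 5% level
```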
We can create a confidence interval for β1, as before. For example, a 95% confidence interval for β1 is given by:

    ( β̂1 − 1.96 × SE(β̂1), β̂1 + 1.96 × SE(β̂1) )

In our example, this confidence interval (also reported by Stata) is:

    (−1.95, −0.25)

We can also use the p-value to conduct the test. The p-value is 0.011 < 0.05 = α, therefore we reject the null hypothesis at the 5% level.
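Both the interval and the p-value follow from the same two numbers; a Python sketch (normal approximation; small differences from the slide's Stata output come from rounding the reported coefficient and SE):

```python
from statistics import NormalDist

beta_hat, se = -1.10, 0.43  # estimates from the test-scores regression

# 95% confidence interval for beta_1.
ci = (beta_hat - 1.96 * se, beta_hat + 1.96 * se)
print(round(ci[0], 2), round(ci[1], 2))  # -1.94 -0.26 (slide reports -1.95, -0.25)

# Two-sided p-value for H0: beta_1 = 0.
p_value = 2 * NormalDist().cdf(-abs(beta_hat / se))
print(round(p_value, 3))  # 0.011, matching the slide
```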
Another example, with dummy variables. Remember that in this case we had to choose an excluded group.

It turns out that in our LAwages data set there are 4 individuals who are recorded as both Hispanic and Black. Let's drop these individuals. In Stata, type:

    drop if black==1 & hispanic==1
    regress wage hispanic black, robust

We obtain results that are very similar to those in the last lecture:

    Wagei = 16.09 − 5.94 Hispanici − 4.65 Blacki
            (0.75)  (1.12)          (1.03)

where 'Other' is the excluded group.
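With mutually exclusive dummies as the only regressors, OLS reproduces the group means, so each coefficient is a mean difference relative to the excluded group. A pure-Python sketch with made-up wage data (not the LAwages file):

```python
# Made-up wages for three mutually exclusive groups.
wages = {
    "hispanic": [9.0, 10.0, 11.0],
    "black": [11.0, 12.0],
    "other": [15.0, 16.0, 17.0],
}

def mean(xs):
    return sum(xs) / len(xs)

# Coding 1: 'other' excluded -> intercept is the Other mean.
b0 = mean(wages["other"])
b_hispanic = mean(wages["hispanic"]) - b0
b_black = mean(wages["black"]) - b0
print(b0, b_hispanic, b_black)  # 16.0 -6.0 -4.5

# Coding 2: 'hispanic' excluded -> same group means, different baseline.
a0 = mean(wages["hispanic"])
a_black = mean(wages["black"]) - a0
a_other = mean(wages["other"]) - a0
print(a0, a_black, a_other)  # 10.0 1.5 6.0
```

Both codings imply the same three fitted group means (16.0, 11.5, and 10.0 here); only the baseline changes.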
Let's now choose 'Hispanic' as the excluded group:
With 'Other' as the excluded group we have:

    Wagei = 16.09 − 5.94 Hispanici − 4.65 Blacki
            (0.75)  (1.12)          (1.03)

With 'Hispanic' as the excluded group the estimated regression is:

    Wagei = 10.16 + 1.09 Blacki + 5.94 Otheri
            (0.83)  (1.12)       (1.12)

These regressions are really the same. They both say that the average wage for Others is 16.09, the average wage for Blacks is 11.44, and the average wage for Hispanics is 10.15.

But because the variables are coded differently, the coefficients measure different aspects of the relationship, so the t statistics on the coefficients test different things.
In the first regression, the coefficient on the variable Blacki measures the difference between average wages for Blacks and average wages for Others (since 'Other' is the excluded group). Hence the t statistic for the hypothesis that βBlack = 0 tests whether Blacks and Others have equal average wages.

In the second regression, on the other hand, the coefficient on the variable Blacki measures the difference between average wages for Blacks and average wages for Hispanics (since 'Hispanic' is now the excluded group). Hence the t statistic for the hypothesis that βBlack = 0 now tests whether Blacks and Hispanics have equal average wages.
Note that the first regression doesn't directly test whether Blacks and Hispanics have equal average wages, and the second regression doesn't directly test whether Blacks and Others have equal average wages.

Therefore, you may want to choose which group to "exclude" based on which hypotheses you want to test. Or run both regressions, even though they are really the same thing. Or you can use an F statistic (next slides).
3. Hypothesis Tests Regarding Multiple Coefficients

We might be interested in testing:
- whether education and experience are jointly important for wages;
- whether the return to education is the same for men and women.

The t-test procedure is valid for testing the statistical significance of an individual regression coefficient (or a single linear combination of coefficients). However, it is not valid for testing joint hypotheses. In order to test a hypothesis involving multiple coefficients, we need to use the F statistic.
Let's consider the following example:

    Wagei = β0 + β1 Femalei + β2 Educi + β3 Fem_Educi + εi

How do we interpret the coefficients?
- β0 is the intercept for males.
- β0 + β1 is the intercept for females.
- β2 is the return to education for males.
- β2 + β3 is the return to education for females.

A joint hypothesis would be that wages of men and women are the same:

    H0: β1 = 0 and β3 = 0   vs   H1: β1 ≠ 0 and/or β3 ≠ 0

How do we test this hypothesis?
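The coefficient interpretation can be verified mechanically: plug in Female = 0 or 1 and compare intercepts and slopes. A Python sketch with made-up coefficient values:

```python
# Hypothetical coefficients for Wage = b0 + b1*Female + b2*Educ + b3*Fem_Educ.
b0, b1, b2, b3 = 5.0, -1.0, 0.8, -0.2

def expected_wage(female, educ):
    return b0 + b1 * female + b2 * educ + b3 * female * educ

# Intercepts: b0 for males, b0 + b1 for females.
print(expected_wage(0, 0), expected_wage(1, 0))  # 5.0 4.0

# Returns to education: b2 for males, b2 + b3 for females.
male_return = expected_wage(0, 13) - expected_wage(0, 12)
female_return = expected_wage(1, 13) - expected_wage(1, 12)
print(round(male_return, 2), round(female_return, 2))  # 0.8 0.6
```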
This is the intuition behind the procedure to test joint hypotheses:

We start by estimating the full (unconstrained) model (as in the previous slide).
We then estimate the constrained model, which incorporates the restrictions that are true under H0 (in this case, β1 = 0 and β3 = 0).
The RSS for the constrained model is always larger than for the unconstrained model, for 2 reasons:
1. The RSS always increases when variables are dropped from the model ⇒ this is an algebraic fact.
2. If X1 and X3 contribute to explaining Y (i.e. β1 ≠ 0 and/or β3 ≠ 0), the RSS increases when we drop them from the model.
The idea is to measure how much of the increase in the RSS is due to the restrictions. If it is large, we will reject the null hypothesis.
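The two-model comparison above can be sketched in code. This is a minimal illustration with made-up numbers (not the lecture's data), using a single regressor so that the constrained model reduces to fitting the sample mean:

```python
# Hypothetical data: y roughly linear in x, plus noise.
x = [1, 2, 3, 4, 5]
y = [2.1, 2.9, 4.2, 4.8, 6.1]
n = len(x)

# Unconstrained model: y = b0 + b1*x, estimated by the OLS closed form.
xbar, ybar = sum(x) / n, sum(y) / n
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
    sum((xi - xbar) ** 2 for xi in x)
b0 = ybar - b1 * xbar
rss_u = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))

# Constrained model (b1 = 0): the OLS fit is just the sample mean of y.
rss_c = sum((yi - ybar) ** 2 for yi in y)

print(rss_c >= rss_u)  # True: dropping a regressor never lowers the RSS
```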
Formally, we follow these steps:

1. H0: β1 = 0 and β3 = 0.
2. H1: β1 ≠ 0 and/or β3 ≠ 0. This means that at least one of the two coefficients is different from zero, or that "β1 and β3 are jointly significant".
3. Choose α = 5%.
4. Estimate the unconstrained model:
   Wage_i = β0 + β1 Female_i + β2 Educ_i + β3 Fem_Educ_i + ε_i
5. Compute the RSS for the unconstrained model (RSS_U).
6. Estimate the constrained model:
   Wage_i = β0 + β2 Educ_i + ε_i
7. Compute the RSS for the constrained model (RSS_C). Note that RSS_C is always larger than RSS_U.
8. The F-statistic follows the F distribution with (m, n − k − 1) degrees of freedom:

   F = [(RSS_C − RSS_U)/m] / [RSS_U/(n − k − 1)] ~ F(m, n−k−1)

   where:
   m = number of linear restrictions
   k = number of regressors in the unconstrained model.
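The F-statistic in step 8 is straightforward to compute once the two RSS values are in hand. A small sketch (the function name and the illustrative numbers are hypothetical, not from the lecture):

```python
def f_stat(rss_c, rss_u, m, n, k):
    """F-statistic for m linear restrictions:
    ((RSS_C - RSS_U)/m) / (RSS_U/(n - k - 1))."""
    return ((rss_c - rss_u) / m) / (rss_u / (n - k - 1))

# Illustrative numbers: 2 restrictions, n = 104 observations, k = 3 regressors.
print(f_stat(rss_c=120.0, rss_u=100.0, m=2, n=104, k=3))  # 10.0
```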
9. Compute the actual value of the F-statistic, F^act.
10. Find the critical value for the F(m, n−k−1) distribution from the tables (let's call it F^α(m, n−k−1)). Remember:
    In the book you will find tables for the F(n1, n2) distribution. In our notation, n1 = m and n2 = n − k − 1.
    You will find separate tables for different significance levels (1%, 5% and 10%).
    When n is large (typically n − k − 1 > 120), the F-statistic has an F(m, ∞) distribution, which can also be found in the tables.
11. Reject H0 if F^act > F^α(m, n−k−1).
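For large samples the table entry F^α(m, ∞) can be reproduced without tables: as the denominator degrees of freedom grow, m·F converges to a chi-square with m degrees of freedom, and for m = 2 that chi-square CDF has the closed form 1 − e^(−x/2). A quick sketch for α = 5% (a numerical illustration, not part of the lecture):

```python
import math

# chi2(2) CDF is 1 - exp(-x/2), so its upper-alpha point is -2*ln(alpha);
# dividing by m gives the F(2, infinity) critical value.
alpha = 0.05
m = 2
crit = -2 * math.log(alpha) / m
print(round(crit, 2))  # 3.0, matching the table value of 3.00
```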
Notes about the F-test:

The constrained model always has fewer parameters than the unconstrained one.
F^act is always non-negative.
F^act measures the relative increase in RSS when moving from the unconstrained model to the constrained model.
It can be shown that:

F = [(R²_U − R²_C)/m] / [(1 − R²_U)/(n − k − 1)]
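Since R² = 1 − RSS/TSS and both models share the same TSS, the R² form is algebraically identical to the RSS form. A quick numerical check, using the sums of squares that appear in the Stata output later in the lecture:

```python
# Sums of squares from the lecture's Stata output.
tss = 85266.962
rss_u = 67601.3577
rss_c = 71990.0625
m, n, k = 2, 65241, 3

# RSS form of the F-statistic.
f_rss = ((rss_c - rss_u) / m) / (rss_u / (n - k - 1))

# R-squared form: R2 = 1 - RSS/TSS for each model.
r2_u = 1 - rss_u / tss
r2_c = 1 - rss_c / tss
f_r2 = ((r2_u - r2_c) / m) / ((1 - r2_u) / (n - k - 1))

print(abs(f_rss - f_r2) < 1e-9)  # True: the two forms agree
```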
STATA EXAMPLE: F-TEST

Let's proceed with the example. We start by estimating the unconstrained model:

. gen fem_ed=female*education
. reg lincome female education fem_ed

      Source |       SS       df       MS              Number of obs =   65241
-------------+------------------------------           F(  3, 65237) = 5682.58
       Model |  17665.6042     3  5888.53474           Prob > F      =  0.0000
    Residual |  67601.3577 65237  1.03624259           R-squared     =  0.2072
-------------+------------------------------           Adj R-squared =  0.2071
       Total |   85266.962 65240  1.30697367           Root MSE      =   1.018

     lincome |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
      female |   -.498354   .0458689   -10.86   0.000    -.588257    -.408451
   education |   .1937551   .0022496    86.13   0.000    .1893459    .1981643
      fem_ed |  -.0015693   .0034135    -0.46   0.646   -.0082598    .0051212
       _cons |   7.481399   .0301938   247.78   0.000    7.422219    7.540579
Next we estimate the constrained model:

. reg lincome education

      Source |       SS       df       MS              Number of obs =   65241
-------------+------------------------------           F(  1, 65239) =12031.82
       Model |  13276.8994     1  13276.8994           Prob > F      =  0.0000
    Residual |  71990.0625 65239  1.10348201           R-squared     =  0.1557
-------------+------------------------------           Adj R-squared =  0.1557
       Total |   85266.962 65240  1.30697367           Root MSE      =  1.0505

     lincome |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
   education |   .1914982   .0017458   109.69   0.000    .1880764      .19492
       _cons |   7.261253   .0234553   309.58   0.000    7.215281    7.307225
The F-statistic is:

F = [(RSS_C − RSS_U)/m] / [RSS_U/(n − k − 1)]
  = [(71990 − 67601)/2] / [67601/(65241 − 3 − 1)]
  ≈ 2117

The critical value F^α(2, ∞), for α = 5%, is 3.00. Since 2117 > 3, we reject the null hypothesis.
β1 and β3 are jointly significant (wages are not the same for men and women).
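As a sanity check on the arithmetic, plugging the slide's rounded RSS values into the formula (variable names are mine):

```python
# Rounded sums of squares from the slide.
rss_c, rss_u = 71990, 67601
m, n, k = 2, 65241, 3

f_act = ((rss_c - rss_u) / m) / (rss_u / (n - k - 1))
f_crit = 3.00  # F(2, infinity) at the 5% level, from the tables

print(int(f_act), f_act > f_crit)  # 2117 True
```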
We can run the test directly in Stata, using either of the following commands:

test female fem_ed
test female=fem_ed=0

These are the results:

. test female fem_ed
 ( 1)  female = 0
 ( 2)  fem_ed = 0
       F(  2, 65237) = 2117.60
            Prob > F =  0.0000

. test female fem_ed education
 ( 1)  female = 0
 ( 2)  fem_ed = 0
 ( 3)  education = 0
       F(  3, 65237) = 5682.58
4. Overall Regression F Statistic

A special type of F-test is the test of the hypothesis that all slope coefficients are equal to 0.
In this case we test whether none of the X variables has any explanatory power (i.e. whether all β's except for the constant are equal to 0).
Notice that for this particular test RSS_C = TSS. To see why, notice that the constrained regression:

Y = β0 + ε

yields β̂0 = Ȳ, the sample mean of Y. The RSS will pick up any variation around the mean, i.e. the TSS.
In this case, the F-statistic is:

F = [(TSS_C − RSS_U)/m] / [RSS_U/(n − k − 1)]

But TSS_C = TSS_U. Therefore:

F = [(TSS_U − RSS_U)/m] / [RSS_U/(n − k − 1)] = [ESS_U/m] / [RSS_U/(n − k − 1)]

Notice that we only need one regression to compute the value of the F-statistic.
This F-statistic is called the overall regression F-statistic. It is reported by Stata below the number of observations.
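The overall F reported by Stata can be reproduced from the ANOVA block of the unconstrained output, where the Model sum of squares is ESS_U and the Residual sum of squares is RSS_U (variable names are mine):

```python
# ANOVA block of the unconstrained Stata output.
ess_u = 17665.6042    # Model sum of squares
rss_u = 67601.3577    # Residual sum of squares
m, n, k = 3, 65241, 3  # all k = 3 slope coefficients restricted to zero

f_overall = (ess_u / m) / (rss_u / (n - k - 1))
print(round(f_overall, 2))  # 5682.58, as printed in Stata's header
```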
Looking back at the unconstrained regression output above, Stata reports the overall regression F-statistic in the header, F( 3, 65237) = 5682.58 with Prob > F = 0.0000, which matches the output of "test female fem_ed education" exactly: the regressors are jointly significant.