Chapter 3
Multiple regression analysis
Overview
This chapter introduces regression models with more than one explanatory variable. Specific topics are treated with reference to a model with just two explanatory variables, but most of the concepts and results apply straightforwardly to more general models. The chapter begins by showing how the least squares principle is employed to derive the expressions for the regression coefficients and how the coefficients should be interpreted. It continues with a discussion of the precision of the regression coefficients and tests of hypotheses relating to them. Next comes multicollinearity, the problem of discriminating between the effects of individual explanatory variables when they are closely related. The chapter concludes with a discussion of F tests of the joint explanatory power of the explanatory variables or of subsets of them, and shows how a t test can be thought of as a marginal F test.
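As a concrete illustration of the F test of joint explanatory power, the F statistic can be recomputed by hand from the explained and residual sums of squares. The figures below are taken from the Stata output reported in Exercise A3.1; this is a sketch of the arithmetic, not part of the text's own derivation.

```python
# F test of joint explanatory power: under H0 that all slope
# coefficients are zero, F = (ESS/k) / (RSS/(n - k - 1)).
# Sums of squares taken from the Stata output in Exercise A3.1.
ESS = 1.4826e9      # explained (model) sum of squares
RSS = 1.5025e9      # residual sum of squares
n, k = 868, 2       # observations, explanatory variables

F = (ESS / k) / (RSS / (n - k - 1))
print(round(F, 2))  # close to the reported F(2, 865) = 426.78
```

The small discrepancy from the printed value reflects only the rounding of the sums of squares in the output.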
Learning outcomes
After working through the corresponding chapter in the text, studying the corresponding slideshows, and doing
the starred exercises in the text and the additional exercises in this guide, you should be able to explain:
•	the principles behind the derivation of multiple regression coefficients (but you are not expected to learn the expressions for them or to be able to reproduce the mathematical proofs)
•	how to interpret the regression coefficients
•	the Frisch–Waugh–Lovell graphical representation of the relationship between the dependent variable and one explanatory variable, controlling for the influence of the other explanatory variables
•	the properties of the multiple regression coefficients
•	what factors determine the population variance of the regression coefficients
•	what is meant by multicollinearity
•	what measures may be appropriate for alleviating multicollinearity
•	what is meant by a linear restriction
•	the F test of the joint explanatory power of the explanatory variables
•	the F test of the explanatory power of a group of explanatory variables
•	why t tests on the slope coefficients are equivalent to marginal F tests.
You should know the expression for the population variance of a slope coefficient in a multiple regression model
with two explanatory variables.
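A common form of this expression, writing b2 for the estimated coefficient of X2, sigma_u^2 for the disturbance variance, and r for the sample correlation between the two explanatory variables (check the exact notation against the text), is:

```latex
\sigma_{b_2}^2 \;=\; \frac{\sigma_u^2}{\sum_{i=1}^{n}\left(X_{2i}-\bar{X}_2\right)^2}
\times \frac{1}{1 - r_{X_2 X_3}^2}
```

The first factor is the same as in simple regression; the second shows how correlation between the regressors inflates the variance, which is the source of the multicollinearity problem discussed in the chapter.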
Additional exercises
A3.1	The output shows the result of regressing FDHO, expenditure on food consumed at home, on EXP, total household expenditure, and SIZE, number of persons in the household, using the CES data set. Provide an interpretation of the regression coefficients and perform appropriate tests.
. reg FDHO EXP SIZE if FDHO>0

      Source |       SS       df       MS              Number of obs =     868
-------------+------------------------------           F(  2,   865) =  426.78
       Model |  1.4826e+09     2   741314291           Prob > F      =  0.0000
    Residual |  1.5025e+09   865  1736978.64           R-squared     =  0.4967
-------------+------------------------------           Adj R-squared =  0.4955
       Total |  2.9851e+09   867  3443039.33           Root MSE      =  1317.9

------------------------------------------------------------------------------
        FDHO |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
         EXP |   .0372621   .0024547    15.18   0.000     .0324442      .04208
        SIZE |   559.7692   30.85684    18.14   0.000     499.2061    620.3322
       _cons |   884.5901   100.1537     8.83   0.000     688.0173    1081.163
------------------------------------------------------------------------------
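The t statistics and confidence intervals in the output can be reproduced directly from the reported coefficients and standard errors. The sketch below does this in plain Python; the critical value 1.963 for 865 degrees of freedom is an assumed (approximate) figure, close to the normal-distribution value of 1.96.

```python
# Reproducing the hypothesis tests in the Exercise A3.1 output
# from the reported coefficients and standard errors.
coef = {"EXP": 0.0372621, "SIZE": 559.7692, "_cons": 884.5901}
se   = {"EXP": 0.0024547, "SIZE": 30.85684, "_cons": 100.1537}

# t statistic for H0: beta = 0 is the coefficient divided by its s.e.
t = {name: coef[name] / se[name] for name in coef}
print({name: round(val, 2) for name, val in t.items()})  # 15.18, 18.14, 8.83

# 95% confidence interval: coefficient +/- t_crit * s.e.
# t_crit ~ 1.963 for 865 d.f. (assumed; close to the normal 1.96).
t_crit = 1.963
lo = coef["EXP"] - t_crit * se["EXP"]
hi = coef["EXP"] + t_crit * se["EXP"]
print(round(lo, 4), round(hi, 4))  # close to the reported .0324442, .04208
```

All three t statistics comfortably exceed the critical value, so EXP, SIZE, and the intercept are each significantly different from zero at the 0.1 percent level.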
A3.2	Perform a regression parallel to that in Exercise A3.1 for your CES category of expenditure, provide an interpretation of the regression coefficients, and perform appropriate tests.