Distribution of Estimates and Multivariate Regression
Lecture XXVII
I. Models and Distributional Assumptions
A. Conditional Normal Model
1. The conditional normal model assumes that the observed random variables are distributed
\[ y_i \sim N\left( \alpha + \beta x_i, \sigma^2 \right) \]
Thus, \( E\left[ y_i \mid x_i \right] = \alpha + \beta x_i \) and the variance of \( y_i \) equals \( \sigma^2 \). The conditional normal model can be expressed as
\[ y_i = \alpha + \beta x_i + \varepsilon_i, \quad \varepsilon_i \sim N\left( 0, \sigma^2 \right) \]
Further, the \( \varepsilon_i \) are independently and identically distributed (consistent with our BLUE proof).
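As a quick numerical illustration of this model, the sketch below simulates \( y_i = \alpha + \beta x_i + \varepsilon_i \) with iid normal errors and checks that the conditional mean and variance behave as stated. The parameter values, sample size, and seed are illustrative assumptions, not from the notes.

```python
import numpy as np

# Illustrative (assumed) parameter values for the conditional normal model
rng = np.random.default_rng(42)
alpha, beta, sigma = 2.0, 0.5, 1.0
n = 100_000

x = np.full(n, 3.0)                   # hold x fixed to examine E[y | x]
eps = rng.normal(0.0, sigma, size=n)  # iid N(0, sigma^2) errors
y = alpha + beta * x + eps            # y_i = alpha + beta*x_i + eps_i

print(y.mean())  # close to alpha + beta*3.0 = 3.5
print(y.var())   # close to sigma^2 = 1.0
```

Holding \( x_i \) fixed isolates the conditional distribution: the simulated mean matches \( \alpha + \beta x_i \) and the variance matches \( \sigma^2 \), as the model asserts.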
2. Given this formulation, the likelihood function for the simple linear model can be written:
\[ L\left( \alpha, \beta, \sigma^2 \right) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{\left( y_i - \alpha - \beta x_i \right)^2}{2\sigma^2} \right) \]
Taking the log of this likelihood function yields:
\[ \ln L = -\frac{n}{2}\ln\left( 2\pi \right) - \frac{n}{2}\ln \sigma^2 - \frac{1}{2\sigma^2}\sum_{i=1}^{n}\left( y_i - \alpha - \beta x_i \right)^2 \]
As discussed in Lecture XVII, this likelihood function can be concentrated so that
\[ \ln L = -\frac{n}{2}\ln\left( 2\pi \right) - \frac{n}{2}\ln \hat{\sigma}^2 - \frac{n}{2}, \quad \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}\left( y_i - \alpha - \beta x_i \right)^2 \]
AEB 6933 – Mathematical Statistics for Food and Resource Economics
Lecture XXIX, Professor Charles Moss, Fall 2007
Thus the least squares estimators are also maximum likelihood estimators when the error terms are normally distributed.
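The equivalence above can be checked numerically. The sketch below (using made-up data, with assumed intercept, slope, and seed) computes the OLS estimates from the normal equations and verifies that perturbing them in either direction lowers the concentrated log-likelihood.

```python
import numpy as np

# Made-up sample for illustration; the true values 1.0 and 2.0 are assumptions
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 1.0 + 2.0 * x + rng.normal(size=200)

def concentrated_loglik(a, b):
    # ln L = -(n/2) ln(2*pi) - (n/2) ln(sigma_hat^2) - n/2,
    # with sigma_hat^2 = (1/n) * sum((y - a - b*x)^2)
    n = len(y)
    s2 = np.mean((y - a - b * x) ** 2)
    return -0.5 * n * np.log(2 * np.pi) - 0.5 * n * np.log(s2) - 0.5 * n

# OLS slope and intercept via the normal equations
b_ols = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a_ols = y.mean() - b_ols * x.mean()

# Moving away from the OLS estimates lowers the log-likelihood
assert concentrated_loglik(a_ols, b_ols) > concentrated_loglik(a_ols + 0.1, b_ols)
assert concentrated_loglik(a_ols, b_ols) > concentrated_loglik(a_ols, b_ols + 0.1)
```

Because the concentrated log-likelihood is a decreasing function of the mean squared residual, whatever minimizes the sum of squared errors also maximizes the likelihood, which is the substance of the equivalence.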
3. The variance of \( \hat{\beta} \) can be derived from the Gauss-Markov results.
a) Note from last lecture:
\[ \hat{\beta} = \sum_{i=1}^{n} d_i y_i, \quad d_i = \frac{x_i - \bar{x}}{S_{xx}}, \quad S_{xx} = \sum_{i=1}^{n}\left( x_i - \bar{x} \right)^2 \]
so that
\[ \hat{\beta} = \alpha \sum_{i=1}^{n} d_i + \beta \sum_{i=1}^{n} d_i x_i + \sum_{i=1}^{n} d_i \varepsilon_i \quad (1) \]
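The expansion in equation (1) simplifies because the weights \( d_i \) satisfy \( \sum_i d_i = 0 \) and \( \sum_i d_i x_i = 1 \), leaving \( \hat{\beta} = \beta + \sum_i d_i \varepsilon_i \). A minimal sketch, using arbitrary simulated \( x \) values, confirms both properties numerically:

```python
import numpy as np

# Arbitrary x sample (an assumption for illustration)
rng = np.random.default_rng(1)
x = rng.normal(size=50)

# Gauss-Markov weights d_i = (x_i - x_bar) / S_xx
d = (x - x.mean()) / np.sum((x - x.mean()) ** 2)

assert abs(d.sum()) < 1e-12            # sum of weights is zero
assert abs(np.sum(d * x) - 1) < 1e-12  # weights recover the slope exactly
```

With these two identities, the intercept term drops out and the slope term reduces to \( \beta \), so the variance of \( \hat{\beta} \) depends only on \( \sum_i d_i \varepsilon_i \).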