Lecture 29-2007 - Distribution of Estimates and...

Distribution of Estimates and Multivariate Regression
Lecture XXIX

I. Models and Distributional Assumptions

A. Conditional Normal Model

1. The conditional normal model assumes that the observed random variables are distributed

\[ y_i \sim N\left( \alpha + \beta x_i,\; \sigma^2 \right) \]

Thus \( E[y_i \mid x_i] = \alpha + \beta x_i \) and the variance of \( y_i \) equals \( \sigma^2 \). The conditional normal model can be expressed as

\[ y_i = \alpha + \beta x_i + \varepsilon_i, \qquad \varepsilon_i \sim N(0, \sigma^2) \]

Further, the \( \varepsilon_i \) are independently and identically distributed (consistent with our BLUE proof).

2. Given this formulation, the likelihood function for the simple linear model can be written:

\[ L(\alpha, \beta, \sigma^2) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left[ -\frac{(y_i - \alpha - \beta x_i)^2}{2\sigma^2} \right] \]

Taking the log of this likelihood function yields:

\[ \ln L = -\frac{n}{2} \ln(2\pi) - \frac{n}{2} \ln(\sigma^2) - \frac{1}{2\sigma^2} \sum_{i=1}^{n} (y_i - \alpha - \beta x_i)^2 \]

As discussed in Lecture XVII, this likelihood function can be concentrated by substituting the maximum likelihood estimator \( \hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} (y_i - \alpha - \beta x_i)^2 \), so that

\[ \ln L^*(\alpha, \beta) = -\frac{n}{2} \left[ \ln(2\pi) + 1 \right] - \frac{n}{2} \ln\left[ \frac{1}{n} \sum_{i=1}^{n} (y_i - \alpha - \beta x_i)^2 \right] \]
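As a quick numerical sketch of this result (not part of the lecture; the simulated data, parameter values, and names such as `conc_loglik` are illustrative assumptions), the closed-form least squares estimates can be checked against the concentrated log-likelihood: perturbing either parameter away from the OLS solution should lower the likelihood.

```python
# Sketch: verify that the closed-form OLS estimates maximize the concentrated
# log-likelihood  ln L*(a, b) = -(n/2)(ln 2*pi + 1) - (n/2) ln((1/n) SSE(a, b)).
# Simulated data and all names here are illustrative assumptions.
import math
import random

random.seed(42)
n = 200
alpha_true, beta_true, sigma = 1.0, 2.0, 0.5
x = [random.uniform(0.0, 10.0) for _ in range(n)]
y = [alpha_true + beta_true * xi + random.gauss(0.0, sigma) for xi in x]

# Closed-form OLS estimates (= ML estimates under normal errors)
xbar = sum(x) / n
ybar = sum(y) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)
b_hat = sum((xi - xbar) * yi for xi, yi in zip(x, y)) / Sxx
a_hat = ybar - b_hat * xbar

def conc_loglik(a, b):
    """Concentrated log-likelihood: sigma^2 replaced by its MLE, SSE/n."""
    sse = sum((yi - a - b * xi) ** 2 for xi, yi in zip(x, y))
    return -(n / 2) * (math.log(2 * math.pi) + 1) - (n / 2) * math.log(sse / n)

# Moving either parameter off the OLS solution strictly lowers the likelihood,
# because OLS minimizes the sum of squared errors inside the log.
L0 = conc_loglik(a_hat, b_hat)
assert L0 > conc_loglik(a_hat + 0.1, b_hat)
assert L0 > conc_loglik(a_hat - 0.1, b_hat)
assert L0 > conc_loglik(a_hat, b_hat + 0.1)
assert L0 > conc_loglik(a_hat, b_hat - 0.1)
```

Because the first two terms of the concentrated log-likelihood are constants, maximizing it is equivalent to minimizing the sum of squared errors, which is exactly the least squares problem.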
AEB 6933 – Mathematical Statistics for Food and Resource Economics
Lecture XXIX
Professor Charles Moss
Fall 2007

So the least squares estimators are also maximum likelihood estimators if the error terms are normal.

3. The variance of \( \hat{\beta} \) can be derived from the Gauss-Markov results.

a) Note from the last lecture:

\[ \hat{\beta} = \frac{\sum_{i=1}^{n} (x_i - \bar{x})\, y_i}{\sum_{i=1}^{n} (x_i - \bar{x})^2} = \sum_{i=1}^{n} d_i y_i, \qquad d_i = \frac{x_i - \bar{x}}{S_{xx}} \]

Substituting \( y_i = \alpha + \beta x_i + \varepsilon_i \),

\[ \hat{\beta} = \sum_{i=1}^{n} d_i \left( \alpha + \beta x_i + \varepsilon_i \right) = \alpha \sum_{i=1}^{n} d_i + \beta \sum_{i=1}^{n} d_i x_i + \sum_{i=1}^{n} d_i \varepsilon_i \tag{1} \]

Under our standard assumptions about the error
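The weights \( d_i = (x_i - \bar{x})/S_{xx} \) satisfy \( \sum_i d_i = 0 \) and \( \sum_i d_i x_i = 1 \), which collapses equation (1) to \( \hat{\beta} = \beta + \sum_i d_i \varepsilon_i \). A minimal numerical sketch of these properties (the simulated data and variable names are my assumptions, not the lecture's):

```python
# Sketch: check the Gauss-Markov weight properties sum(d_i) = 0 and
# sum(d_i * x_i) = 1, and the decomposition beta_hat = beta + sum(d_i * eps_i).
# Simulated data and all names here are illustrative assumptions.
import random

random.seed(0)
n = 100
alpha, beta, sigma = 1.0, 2.0, 0.5
x = [random.uniform(0.0, 10.0) for _ in range(n)]
eps = [random.gauss(0.0, sigma) for _ in range(n)]
y = [alpha + beta * xi + ei for xi, ei in zip(x, eps)]

xbar = sum(x) / n
Sxx = sum((xi - xbar) ** 2 for xi in x)
d = [(xi - xbar) / Sxx for xi in x]

# The two weight identities used in the proof (up to floating-point rounding)
assert abs(sum(d)) < 1e-12                                     # sum d_i = 0
assert abs(sum(di * xi for di, xi in zip(d, x)) - 1.0) < 1e-12 # sum d_i x_i = 1

# Hence beta_hat is the true slope plus a weighted sum of the errors
b_hat = sum(di * yi for di, yi in zip(d, y))
b_decomp = beta + sum(di * ei for di, ei in zip(d, eps))
assert abs(b_hat - b_decomp) < 1e-10
```

The decomposition \( \hat{\beta} = \beta + \sum_i d_i \varepsilon_i \) is what makes the variance calculation straightforward: the only randomness left on the right-hand side is the error terms.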