Distribution of Estimates and Multivariate Regression: Lecture XXIX
Charles B. Moss
November 16, 2010

Outline
1. Models and Distributional Assumptions
2. Multivariate Regression Models
3. Gauss-Markov Theorem

Models and Distributional Assumptions

Conditional Normal Model

The conditional normal model assumes that the observed random variables are distributed

    y_i \sim N(\alpha + \beta x_i, \sigma^2)    (1)

Thus,

    E[y_i \mid x_i] = \alpha + \beta x_i    (2)

and the variance of y_i equals \sigma^2. The conditional normal model can be expressed as

    y_i = \alpha + \beta x_i + \epsilon_i, \qquad \epsilon_i \sim N(0, \sigma^2)    (3)

Further, the \epsilon_i are independently and identically distributed (consistent with our BLUE proof).

Given this formulation, the likelihood function for the simple linear model can be written

    L(\alpha, \beta, \sigma^2 \mid x) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left[ -\frac{(y_i - (\alpha + \beta x_i))^2}{2\sigma^2} \right]    (4)

Taking the log of this likelihood function yields

    \ln(L) = -\frac{n}{2}\ln(2\pi) - \frac{n}{2}\ln(\sigma^2) - \frac{1}{2\sigma^2}\sum_{i=1}^{n}(y_i - \alpha - \beta x_i)^2    (5)

As discussed in Lecture XVII, this likelihood function can be concentrated in such a way that

    \ln(L) \propto -\frac{n}{2}\ln(\hat{\sigma}^2) - \frac{n}{2}, \qquad \hat{\sigma}^2 = \frac{1}{n}\sum_{i=1}^{n}(y_i - \alpha - \beta x_i)^2    (6)

So the least squares estimators are also maximum likelihood estimators if the error terms are normal.
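The equivalence above can be checked numerically. The sketch below (not from the lecture; the data, true parameter values, and grid-search routine are illustrative assumptions) simulates the model y_i = alpha + beta*x_i + eps_i with normal errors, computes the least squares fit in closed form, and then maximizes the concentrated log-likelihood of equation (6) by a crude grid search. Because maximizing -(n/2)ln(sigma_hat^2) is the same as minimizing the sum of squared residuals, the two sets of estimates coincide.

```python
# Sketch: verify numerically that the MLE under normal errors equals OLS.
# The data-generating values (alpha = 1.5, beta = 0.8, sigma = 2) are
# illustrative assumptions, not values from the lecture.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0.0, 10.0, n)
eps = rng.normal(0.0, 2.0, n)          # eps_i ~ N(0, sigma^2)
y = 1.5 + 0.8 * x + eps                # y_i = alpha + beta x_i + eps_i

# Least squares (closed form): minimize sum_i (y_i - alpha - beta x_i)^2
X = np.column_stack([np.ones(n), x])
alpha_ols, beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

# Concentrated log-likelihood, equation (6): up to additive constants,
# ln L = -(n/2) ln(sigma_hat^2) with sigma_hat^2 the mean squared residual,
# so maximizing ln L is minimizing the sum of squared residuals.
def neg_concentrated_loglik(params):
    a, b = params
    resid = y - a - b * x
    return 0.5 * n * np.log(np.mean(resid ** 2))

# Crude grid search in a window around the OLS solution to locate the MLE.
grid = np.linspace(-0.5, 0.5, 201)     # includes 0.0, so the OLS point is on the grid
best = min(
    ((alpha_ols + da, beta_ols + db) for da in grid for db in grid),
    key=neg_concentrated_loglik,
)
print("OLS:", alpha_ols, beta_ols)
print("MLE:", best)                    # grid-search MLE coincides with OLS
```

Any smooth optimizer would do in place of the grid search; the point is only that the likelihood in equation (6) is maximized at exactly the least squares solution.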