
# Lecture 1: Regression Analysis and Extension (Review of Added Issues)


Read: Wooldridge Ch. 4; Verbeek Ch. 2, 3, and 4.10; Greene Ch. 2, 3, 4, 6.

## Linear Algebra of OLS Estimation

**Classical Regression Model**

$$ y_i = \beta_0 + \beta_1 x_{1i} + \cdots + \beta_k x_{ki} + e_i, \qquad i = 1, \ldots, n $$

or, stacking the $n$ observations,

$$
\begin{pmatrix} y_1 \\ \vdots \\ y_n \end{pmatrix}
=
\begin{pmatrix}
1 & x_{11} & \cdots & x_{k1} \\
\vdots & \vdots & & \vdots \\
1 & x_{1n} & \cdots & x_{kn}
\end{pmatrix}
\begin{pmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_k \end{pmatrix}
+
\begin{pmatrix} e_1 \\ \vdots \\ e_n \end{pmatrix}
$$

with $X$ of dimension $n \times (k+1)$, $\beta$ of dimension $(k+1) \times 1$, and $y$, $e$ of dimension $n \times 1$; or compactly

$$ y = X\beta + e $$

or, observation by observation,

$$ y_i = x_i \beta + e_i, \qquad x_i = (1, x_{1i}, \ldots, x_{ki}) : 1 \times (k+1), $$

or equivalently $y_i = x_i'\beta + e_i$ where $x_i = (1, x_{1i}, \ldots, x_{ki})'$ is $(k+1) \times 1$.

We observe: $y$, $X$. We do not observe: $\beta$, $e$.

Note: usually one includes an intercept, i.e. $x_{0i} = 1$ for all $i$ is the intercept. (Thus we drop $x_{0i}$ and set $x_{1i} = 1$, leaving $k$ regressors.)

- Multiple regression: $\partial y_i / \partial x_{ji} = \beta_j$, other things being equal (the partial effect of $x_j$).
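The dimensions above can be made concrete with a small NumPy sketch. All data here are hypothetical, generated purely to illustrate the $y = X\beta + e$ layout with an intercept column:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 2                      # n observations, k regressors (plus intercept)

# Hypothetical regressors: a column of ones (the intercept) stacked with k
# random columns, giving X the n x (k+1) shape from the notes.
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])

beta = np.array([1.0, 0.5, -2.0])  # illustrative (k+1) x 1 coefficient vector
e = rng.normal(size=n)             # unobserved errors

y = X @ beta + e                   # y = X beta + e, an (n x 1) vector

print(X.shape, y.shape)            # (100, 3) (100,)
```

In practice we would observe only `X` and `y`; `beta` and `e` are written out here only because we are simulating the data ourselves.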
**Def.**: The model $y = X\beta + e$ satisfies the *ideal conditions* if:

a) The errors $e_i$ are independently, identically distributed (*iid*) and $N(0, \sigma^2)$.

b) The regressors $X$ are non-stochastic, $X$ has rank $k+1$, and $\lim_{n \to \infty} \frac{1}{n} X'X$ exists and is non-singular.

Breaking these down, with the standard violation of each assumption noted:

a) Assumptions about $e$ (the errors, never the residuals):

i) $e_i$ normal $\forall i$ — non-normal: robust estimation
ii) $E(e_i) = 0$ $\forall i$ — non-zero mean: specification error (omitted variables)
iii) $\operatorname{var}(e_i) = \sigma^2$ $\forall i$ — heteroskedasticity: HAC errors
iv) $\operatorname{cov}(e_i, e_j) = 0$ $\forall i \neq j$ — autocorrelation: dynamic models

Note: If $X$ is treated as random, these become assumptions about $e \mid X$:
ii) $E(e_i \mid X) = 0$, iii) $\operatorname{var}(e_i \mid x_i) = \sigma^2$.

b) Assumptions about $X$:

i) non-stochastic (fixed) — stochastic regressors
ii) $\operatorname{rank}(X) = k+1$ — multicollinearity
iii) the second moment exists: $\frac{1}{n} X'X = \left( \frac{1}{n} \sum_{i=1}^{n} x_{si} x_{mi} \right)_{s,m}$ exists — integrated regressors

c) Assumption about both:

i) $\operatorname{cov}(x_j, e) = 0$, $j = 1, \ldots, k$ — otherwise $x_j$ is "endogenous"

Note: Wooldridge (p. 50) discusses three sources of endogeneity:
- Omitted variables: $E(e_i) = 0$ but $\operatorname{cov}(x_j, e) \neq 0$
- Measurement error: $x_j = x_j^* + u_i$ (where $x_j^*$ is the true measure)
- Simultaneity (the big issue)
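A key payoff of the ideal conditions is that OLS is unbiased: averaged over repeated samples, $\hat{\beta}$ centers on $\beta$. The Monte Carlo sketch below (hypothetical data, fixed $X$ across replications, iid normal errors) checks this numerically; it is an illustration of the property, not part of the lecture's derivation:

```python
import numpy as np

rng = np.random.default_rng(1)
n, reps = 50, 2000

# Non-stochastic design: X is drawn once and held fixed across replications,
# matching assumption b) i) of the ideal conditions.
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta = np.array([2.0, -1.0])      # illustrative true coefficients

estimates = np.empty((reps, 2))
for r in range(reps):
    e = rng.normal(scale=1.5, size=n)   # iid N(0, 2.25): mean zero, homoskedastic
    y = X @ beta + e
    # OLS via the normal equations: (X'X) beta_hat = X'y
    estimates[r] = np.linalg.solve(X.T @ X, X.T @ y)

print(estimates.mean(axis=0))     # averages close to the true (2.0, -1.0)
```

Replacing the error draw with, say, errors whose variance depends on $x_i$ would violate a) iii) (heteroskedasticity); OLS would remain unbiased here but its usual standard errors would no longer be valid.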

## OLS Estimation

$$ \min_{\beta} \; e'e = (y - X\beta)'(y - X\beta) = y'y - 2\beta'X'y + \beta'X'X\beta $$

Setting the first-order condition to zero,

$$ \frac{\partial (e'e)}{\partial \beta} = -2X'y + 2X'X\beta = 0 \quad \Rightarrow \quad \hat{\beta} = (X'X)^{-1} X'y. $$

Alternatively, minimize $\sum_i (y_i - x_i\beta)'(y_i - x_i\beta)$, where $x_i = (1, x_{1i}, \ldots, x_{ki})$ is $1 \times (k+1)$, which gives

$$ \hat{\beta} = \left( \sum_i x_i' x_i \right)^{-1} \sum_i x_i' y_i. $$

**Note: Differentiation of a scalar with respect to a vector.** Let $z$ be a scalar $(1 \times 1)$ and $\theta = (\theta_1, \ldots, \theta_\rho)'$ a vector $(\rho \times 1)$, so that $\frac{\partial z}{\partial \theta} = \left( \frac{\partial z}{\partial \theta_1}, \ldots, \frac{\partial z}{\partial \theta_\rho} \right)'$.

i) If $a$ and $\theta$ are $\rho \times 1$ vectors, then $z = a'\theta = \theta'a$ is a scalar $(1 \times 1)$ and

$$ \frac{\partial (a'\theta)}{\partial \theta} = \frac{\partial (\theta'a)}{\partial \theta} = a. $$

ii) If $A$ is a symmetric matrix $(\rho \times \rho)$ and $c$ is a vector $(\rho \times 1)$, then $c'Ac$ is a scalar and

$$ \frac{\partial (c'Ac)}{\partial c} = 2Ac. $$

Applying rule i) to $-2\beta'X'y$ and rule ii) to $\beta'X'X\beta$ (with $A = X'X$, which is symmetric) delivers the first-order condition above.
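The closed form $\hat{\beta} = (X'X)^{-1}X'y$ and its summation form $\left(\sum_i x_i'x_i\right)^{-1}\sum_i x_i'y_i$ are the same estimator; the sketch below (hypothetical simulated data) computes both and confirms they agree. Note we solve the normal equations with `np.linalg.solve` rather than inverting $X'X$ explicitly, which is the numerically preferred route:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([0.5, 1.0, -1.0]) + rng.normal(size=n)

# Matrix form: solve (X'X) beta_hat = X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Summation form: accumulate sum_i x_i' x_i and sum_i x_i' y_i row by row
XtX = sum(np.outer(xi, xi) for xi in X)        # (k+1) x (k+1)
Xty = sum(xi * yi for xi, yi in zip(X, y))     # (k+1) x 1

beta_hat_sum = np.linalg.solve(XtX, Xty)

print(np.allclose(beta_hat, beta_hat_sum))     # True: identical estimators
```

Both also match `np.linalg.lstsq(X, y, rcond=None)[0]`, NumPy's least-squares routine.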

## This note was uploaded on 01/19/2010 for the course ECON 6140 taught by Professor Stengos during the Spring '10 term at University of Guelph.

