Econometrics-I-12

# Part 12: Asymptotics for the Regression Model


## Setting Up the Wald Statistic

To do the Wald test, I first need to estimate the asymptotic covariance matrix of the sample estimates f1 and f2. After estimating the regression by least squares, the estimates are

    f1 = b4/b5 - b7/b8
    f2 = b4/b5 - b9/b8.

Then, using the delta method, I will estimate the asymptotic variances of f1 and f2 and the asymptotic covariance of f1 and f2. For this, write f1 = f1(b), that is, a function of the entire 10×1 coefficient vector b. Then, I compute the 1×10 derivative vectors

    d1 = ∂f1(b)/∂b  and  d2 = ∂f2(b)/∂b.

With elements ordered to match b1 through b10, these vectors are

    d1 = [0, 0, 0, 1/b5, -b4/b5², 0, -1/b8, b7/b8², 0, 0]
    d2 = [0, 0, 0, 1/b5, -b4/b5², 0, 0, b9/b8², -1/b8, 0]

## Wald Statistics

Then D = the 2×10 matrix with first row d1 and second row d2. The estimator of the asymptotic covariance matrix of [f1, f2]′ (a 2×1 column vector) is

    V = D [s²(X′X)⁻¹] D′.

Finally, the Wald test of the hypothesis that f = 0 is carried out by using the chi-squared statistic

    W = (f - 0)′ V⁻¹ (f - 0).

This is a chi-squared statistic with 2 degrees of freedom. The 95% critical value from the chi-squared table is 5.99, so if my sample chi-squared statistic is greater than 5.99, I reject the hypothesis.

## Wald Test

In the example below, to make this a little simpler, I computed the 10-variable regression, then extracted the 5×1 subvector c = (b4, b5, b7, b8, b9)′ of the coefficient vector and its associated block of the 10×10 covariance matrix. Then, I manipulated this smaller set of values.

## Application of the Wald Statistic
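Before the LIMDEP/NLOGIT listing, the delta-method computation above can be sketched in NumPy. The coefficient vector `b` and covariance matrix `Vb` below are made-up stand-ins (not from the notes) for the least squares estimates and s²(X′X)⁻¹:

```python
import numpy as np

# Hypothetical 10x1 coefficient vector and its covariance matrix
# (illustrative values only, standing in for b and s^2 (X'X)^-1)
b = np.array([0.2, -0.1, 0.3, 1.2, 0.8, 0.05, 0.6, 1.5, 0.9, -0.4])
Vb = 0.01 * np.eye(10)

# f1 = b4/b5 - b7/b8,  f2 = b4/b5 - b9/b8  (1-based as in the text)
f = np.array([b[3]/b[4] - b[6]/b[7],
              b[3]/b[4] - b[8]/b[7]])

# D: 2x10 matrix whose rows are the derivative vectors d1 and d2
D = np.zeros((2, 10))
D[0, 3] = 1/b[4];        D[0, 4] = -b[3]/b[4]**2
D[0, 6] = -1/b[7];       D[0, 7] = b[6]/b[7]**2
D[1, 3] = 1/b[4];        D[1, 4] = -b[3]/b[4]**2
D[1, 7] = b[8]/b[7]**2;  D[1, 8] = -1/b[7]

V = D @ Vb @ D.T               # estimated Asy.Var[f1, f2]
W = f @ np.linalg.solve(V, f)  # Wald statistic, chi-squared with 2 df
print("reject H0" if W > 5.99 else "do not reject H0")
```

Using `np.linalg.solve(V, f)` computes V⁻¹f without forming the explicit inverse, which is the numerically preferred way to evaluate the quadratic form.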
```
? Extract subvector and submatrix for the test
matrix ; list ; c = [b(4)/b(5)/b(7)/b(8)/b(9)]$
matrix ; list ; vc = [varb(4,4)/
                      varb(5,4),varb(5,5)/
                      varb(7,4),varb(7,5),varb(7,7)/
                      varb(8,4),varb(8,5),varb(8,7),varb(8,8)/
                      varb(9,4),varb(9,5),varb(9,7),varb(9,8),varb(9,9)]$
? Compute derivatives
calc ; list ; g11=1/c(2) ; g12=-c(1)*g11*g11
            ; g13=-1/c(4) ; g14=c(3)*g13*g13 ; g15=0
            ; g21=g11 ; g22=g12 ; g23=0
            ; g24=c(5)/c(4)^2 ; g25=-1/c(4)$
? Move derivatives to matrix
matrix ; list ; dfdc = [g11,g12,g13,g14,g15 / g21,g22,g23,g24,g25]$
? Compute functions, then move to matrix and compute Wald statistic
calc ; list ; f1=c(1)/c(2) - c(3)/c(4) ; f2=c(1)/c(2) - c(5)/c(4)$
matrix ; list ; f = [f1/f2]$
matrix ; list ; vf = dfdc * vc * dfdc'$
matrix ; list ; wald = f' * <vf> * f$
```

(This is all automated in the WALD command.)

## Computations

Matrix C is 5 rows by 1 columns. ...
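The subvector manipulation in the listing above can be mirrored in NumPy. As before, the full coefficient vector `b` and covariance matrix `varb` are hypothetical stand-ins, not values from the notes:

```python
import numpy as np

# Hypothetical full estimates (illustrative values only)
b = np.array([0.2, -0.1, 0.3, 1.2, 0.8, 0.05, 0.6, 1.5, 0.9, -0.4])
varb = 0.01 * np.eye(10)       # stand-in for the 10x10 covariance matrix

# Extract the 5x1 subvector c = (b4,b5,b7,b8,b9)' and matching 5x5 block
idx = [3, 4, 6, 7, 8]          # 0-based positions of b4, b5, b7, b8, b9
c = b[idx]
vc = varb[np.ix_(idx, idx)]

# dfdc: 2x5 derivative matrix, matching g11..g25 in the listing
dfdc = np.array([
    [1/c[1], -c[0]/c[1]**2, -1/c[3], c[2]/c[3]**2, 0.0],
    [1/c[1], -c[0]/c[1]**2, 0.0, c[4]/c[3]**2, -1/c[3]],
])

f = np.array([c[0]/c[1] - c[2]/c[3],   # f1 = b4/b5 - b7/b8
              c[0]/c[1] - c[4]/c[3]])  # f2 = b4/b5 - b9/b8

vf = dfdc @ vc @ dfdc.T
wald = f @ np.linalg.solve(vf, f)      # f' * inv(vf) * f
print(wald)
```

Because the derivatives of f1 and f2 with respect to the other five coefficients are exactly zero, working with the 5×1 subvector and its 5×5 covariance block gives the same Wald statistic as the full 10-coefficient computation.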