Part 12: Asymptotics for the Regression Model (slides 19–24)

…Sₙ, which has the property plim Sₙ = Σ. The exact chi-squared result no longer holds, but the limiting distribution is the same as if the true Σ were used.

Building the Wald Statistic — 2

Suppose the statistic is computed not with an x that has an exact normal distribution, but with an xₙ that has a limiting normal distribution, and whose finite-sample distribution might be something else. Our earlier results for functions of random variables give us

    (xₙ − μ)′ Sₙ⁻¹ (xₙ − μ)  →d  χ²[K].

Note that nothing in this relies on the normal distribution. What we used is consistency of a certain estimator (Sₙ) and the central limit theorem for xₙ.

General Result for Wald Distance

The Wald distance measure: If plim xₙ = μ, xₙ is asymptotically normally distributed with mean μ and variance Σ, and Sₙ is a consistent estimator of Σ, then the Wald statistic, which is a generalized distance measure between xₙ and μ, converges to a chi-squared variate:

    (xₙ − μ)′ Sₙ⁻¹ (xₙ − μ)  →d  χ²[K].

The F Statistic

An application (familiar): Suppose bₙ is the least squares estimator of β based on a sample of n observations. No assumption of normality of the disturbances or about nonstochastic regressors is made. The standard F statistic for testing the hypothesis H₀: Rβ − q = 0 is

    F[J, n−K] = [(e*′e* − e′e)/J] / [e′e/(n−K)],

which is built from two sums of squared residuals. The statistic does not have an F distribution. How can we test the hypothesis?

JF is a Wald Statistic

    F[J, n−K] = (1/J) (Rbₙ − q)′ [R s²(X′X)⁻¹ R′]⁻¹ (Rbₙ − q).

Write m = Rbₙ − q. Under the hypothesis, plim m = 0.
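The general Wald-distance result above can be checked with a short Monte Carlo sketch (illustrative only, not part of the notes; the exponential data and sample sizes are arbitrary choices). The data are deliberately non-normal, yet the Wald distance built from the sample mean and sample covariance behaves like a χ²[K] variate, exactly as the slide claims: only the CLT for x̄ₙ and consistency of Sₙ are used.

```python
import numpy as np

# Sketch: Wald distance n*(xbar - mu)' S^{-1} (xbar - mu) is approximately
# chi-squared[K] even for non-normal data (assumed setup: i.i.d. exponentials).
rng = np.random.default_rng(0)
K, n, reps = 3, 500, 2000
mu = np.ones(K)                 # true mean of each exponential component
stats = np.empty(reps)
for r in range(reps):
    x = rng.exponential(scale=1.0, size=(n, K))  # skewed, non-normal data
    xbar = x.mean(axis=0)
    S = np.cov(x, rowvar=False)                  # consistent estimator of Sigma
    d = xbar - mu
    stats[r] = n * d @ np.linalg.solve(S, d)     # Wald distance

# A chi-squared[K] variate has mean K and variance 2K.
print(round(stats.mean(), 2))   # near K = 3
print(round(stats.var(), 2))    # near 2K = 6
```

The empirical mean and variance of the simulated Wald distances settle near K and 2K, the chi-squared[K] moments, despite the skewed underlying data.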
    √n·m  →d  N[0, σ² R Q⁻¹ R′].

Estimate the variance with R s²(X′X/n)⁻¹ R′. Then

    (√n·m)′ [Est.Var(√n·m)]⁻¹ (√n·m)

fits exactly into the apparatus developed earlier. If plim bₙ = β, plim s² = σ², and the other asymptotic results we developed for least squares hold, then

    JF[J, n−K]  →d  χ²[J].

Application: Wald Tests

read; nobs=27; nvar=10; names=
Year, G, Pg, Y, Pnc, Puc, Ppt, Pd, Pn, Ps $
1960 129.7  .925 6036 1.045  .836  .810 .444 .331 .302
1961 131.3  .914 6113 1.045  .869  .846 .448 .335 .307
1962 137.1  .919 6271 1.041  .948  .874 .457 .338 .314
1963 141.6  .918 6378 1.035  .960  .885 .463 .343 .320
1964 148.8  .914 6727 1.032 1.001  .901 .470 .347 .325
1965 155.9  .949 7027 1.009  .994  .919 .471 .353 .332
1966 164.9  .970 7280  .991  .970  .952 .475 .366 .342
1967 171.0 1.000 7513 1.000 1.000 1.000 .483 .375 .353
1968 183.4 1.014 7728 1.028 1.028 1.046 .501 ....
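The algebraic claim that JF is the Wald statistic can be verified numerically with a small sketch (not from the notes; the simulated regression, coefficient values, and restriction matrix are illustrative assumptions). It computes F from the two sums of squared residuals, computes the Wald quadratic form directly, and confirms that J·F equals the Wald form exactly.

```python
import numpy as np

# Sketch: verify J*F == (Rb - q)'[R s^2 (X'X)^{-1} R']^{-1}(Rb - q)
# for a simulated regression (assumed setup, chosen for illustration).
rng = np.random.default_rng(1)
n, K, J = 200, 4, 2
beta = np.array([1.0, 0.5, 0.0, 0.0])       # H0 is true: last two are zero
X = np.column_stack([np.ones(n), rng.normal(size=(n, K - 1))])
y = X @ beta + rng.normal(size=n)

# Unrestricted least squares
b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b
s2 = e @ e / (n - K)

# H0: R beta - q = 0 (here, beta_3 = beta_4 = 0)
R = np.array([[0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])
q = np.zeros(J)

# Restricted least squares: drop the last two columns
Xr = X[:, :2]
br = np.linalg.solve(Xr.T @ Xr, Xr.T @ y)
er = y - Xr @ br

# F from the two sums of squared residuals
F = ((er @ er - e @ e) / J) / (e @ e / (n - K))

# Wald quadratic form
m = R @ b - q
V = R @ (s2 * np.linalg.inv(X.T @ X)) @ R.T
W = m @ np.linalg.solve(V, m)

print(np.isclose(J * F, W))   # True: JF is the Wald statistic
```

The equality is an algebraic identity (the restricted-minus-unrestricted sum of squares equals m′[R(X′X)⁻¹R′]⁻¹m), so it holds in every sample, not just in the limit; it is the limiting χ²[J] distribution of JF that requires the asymptotic argument on the slide.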
Course: Econometrics, Fall '10 — H. Bierens
Topics: Least Squares, Regression Analysis, Variance, regression model, Wald