Efficient Minimum Distance Estimation with Multiple Rates of Convergence*

Bertille Antoine† and Eric Renault‡

February 19, 2008

Abstract: This paper extends the asymptotic theory of GMM inference to allow sample counterparts of the estimating equations to converge at (multiple) rates different from the usual square root of the sample size. In this setting, we provide consistent estimation of the structural parameters. In addition, we define a convenient rotation in the parameter space (or reparametrization) that disentangles the different rates of convergence. More precisely, we identify special linear combinations of the structural parameters associated with a specific rate of convergence. Finally, we demonstrate the validity of usual inference procedures, such as the overidentification test and the Wald test, with standard formulas. It is important to stress that both estimation and testing work without requiring knowledge of the various rates. However, the assessment of these rates is crucial for (asymptotic) power considerations. Possible applications include econometric problems with two dimensions of asymptotics, due to trimming, tail estimation, infill asymptotics, social interactions, kernel smoothing, or any kind of regularization.

JEL classification: C32; C12; C13; C51.

Keywords: GMM; Mixed-rates asymptotics; Set estimation; Control variables; Rotation in the coordinate system.

* We would like to thank M. Carrasco, A. Guay, J. Jacod, Y. Kitamura, P. Lavergne, L. Magee, A. Shaikh, V. Zinde-Walsh, and seminar participants at the University of British Columbia, the University of Montreal, and Yale University for helpful discussions.
† Simon Fraser University. Email: [email protected]
‡ University of North Carolina at Chapel Hill, CIRANO and CIREQ.
Email: [email protected]

1 Introduction

The cornerstone of GMM asymptotic distribution theory is that when an estimator $\hat{\theta}_T$ of some vector $\theta$ of parameters (with $\theta^0$ as true unknown value) is defined through a minimum distance problem:

$$\hat{\theta}_T = \arg\min_{\theta} \left[ m_T(\theta)' \, \Omega \, m_T(\theta) \right] \qquad (1.1)$$

$\left[ \sqrt{T}(\hat{\theta}_T - \theta^0) \right]$ inherits the asymptotic normality of $\left[ \sqrt{T}\, m_T(\theta^0) \right]$ by a first-order expansion argument:

$$\sqrt{T}(\hat{\theta}_T - \theta^0) = -\left[ \frac{\partial m_T'(\theta^0)}{\partial \theta} \, \Omega \, \frac{\partial m_T(\theta^0)}{\partial \theta'} \right]^{-1} \frac{\partial m_T'(\theta^0)}{\partial \theta} \, \Omega \, \sqrt{T}\, m_T(\theta^0) + o_P(1) \qquad (1.2)$$

while

$$\operatorname{plim}_{T \to \infty} \left[ m_T(\theta) \right] = 0 \iff \theta = \theta^0 \qquad (1.3)$$

It turns out that, for many reasons (see Section 2 for a list of examples and a review of the literature), including local smoothing, trimming, infill asymptotics, or any kind of non-root-T asymptotics, the asymptotic normality of $m_T(\theta^0)$ may come at a nonstandard rate of convergence: $\left[ T^{\alpha} m_T(\theta^0) \right]$ is asymptotically a non-degenerate Gaussian variable for some $\alpha \neq 1/2$. This does not invalidate the first-order expansion argument (1.2), once one realizes that $\left[ T^{\alpha}(\hat{\theta}_T - \theta^0) \right]$ is asymptotically equivalent to $\left[ T^{\alpha} m_T(\theta^0) \right]$. For instance, Robert (2006) has....
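The minimum distance problem (1.1) can be illustrated with a small simulation. The sketch below is ours, not from the paper: it estimates the mean and variance of an i.i.d. normal sample by minimizing the quadratic form $m_T(\theta)' \Omega m_T(\theta)$ with $\Omega$ the identity matrix (the function names `m_T` and `gmm_estimate` are our own, and the just-identified two-moment setup is a deliberately simple stand-in for the general estimating equations).

```python
import numpy as np
from scipy.optimize import minimize

def m_T(theta, x):
    # Sample moment conditions for theta = (mu, sigma2):
    # E[X - mu] = 0 and E[(X - mu)^2 - sigma2] = 0.
    mu, sigma2 = theta
    return np.array([np.mean(x - mu), np.mean((x - mu) ** 2 - sigma2)])

def gmm_estimate(x, omega=np.eye(2), theta0=(0.0, 1.0)):
    # Minimum distance criterion m_T(theta)' Omega m_T(theta), as in (1.1).
    obj = lambda th: m_T(th, x) @ omega @ m_T(th, x)
    return minimize(obj, theta0, method="Nelder-Mead").x

rng = np.random.default_rng(0)
x = rng.normal(loc=2.0, scale=1.5, size=5000)
theta_hat = gmm_estimate(x)  # close to (2.0, 2.25) for T = 5000
```

In this just-identified case the minimizer drives both sample moments to (approximately) zero, and the estimator converges at the usual root-T rate; the paper's contribution concerns the case where some linear combinations of $m_T$ converge at other rates $T^{\alpha}$, $\alpha \neq 1/2$.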