February 23, 2013
LECTURE 1
REVIEW OF GMM FOR LINEAR MODELS
Definition and asymptotic properties
Suppose that an econometrician observes the data {(Yi, Xi, Zi) : i = 1, …, n}, and the model is given by

    Yi = Xi′β + Ui, and E(Zi Ui) = 0,    (1)

where
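The linear GMM estimator for this model minimizes a quadratic form in the sample moments (1/n) Σ Zi(Yi − Xi′β). A minimal numerical sketch, assuming a simulated overidentified DGP of my own invention (the coefficient values, instrument matrix Pi, and sample size are illustrative, not from the notes):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
beta_true = np.array([1.0, -0.5])

# Illustrative DGP: 4 instruments Z, 2 endogenous regressors X.
# X is endogenous because the error U enters both X and Y.
Z = rng.normal(size=(n, 4))
U = rng.normal(size=n)
Pi = np.array([[1.0, 0.0], [0.5, 0.5], [0.0, 1.0], [0.2, -0.3]])
X = Z @ Pi + 0.5 * U[:, None] + rng.normal(size=(n, 2))
Y = X @ beta_true + U

def gmm_linear(Y, X, Z, W):
    """Linear GMM: beta_hat = (X'Z W Z'X)^{-1} X'Z W Z'Y."""
    A = X.T @ Z @ W @ Z.T @ X
    b = X.T @ Z @ W @ Z.T @ Y
    return np.linalg.solve(A, b)

# First step: W = (Z'Z)^{-1} makes the GMM estimator equal to 2SLS.
W1 = np.linalg.inv(Z.T @ Z)
b1 = gmm_linear(Y, X, Z, W1)

# Second step: efficient weight built from first-step residuals,
# S_hat = (1/n) sum u_i^2 z_i z_i'.
u = Y - X @ b1
S = (Z * (u**2)[:, None]).T @ Z / n
W2 = np.linalg.inv(S)
b2 = gmm_linear(Y, X, Z, W2)
print(b1, b2)
```

With n = 5000 both steps recover β close to its true value; the second step differs from 2SLS only through the heteroskedasticity-robust weight matrix.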
Econ 627
Assignment 1
The due date for this assignment is Monday, January 20.
1. Let A be a symmetric matrix: A = A′.
(a) Show that the determinant of A is equal to the product of its eigenvalues.
(b) Show that the trace of A is equal to the sum of its eigenvalues.
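Both identities follow from the characteristic polynomial of A, and they are easy to check numerically. A quick sketch (the matrix here is an arbitrary random symmetric matrix, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.normal(size=(4, 4))
A = (B + B.T) / 2                  # symmetrize: A = A'

eigvals = np.linalg.eigvalsh(A)    # eigenvalues of a symmetric matrix are real
det_from_eigs = np.prod(eigvals)   # part (a): det(A) = product of eigenvalues
trace_from_eigs = np.sum(eigvals)  # part (b): tr(A) = sum of eigenvalues
print(np.linalg.det(A), det_from_eigs)
print(np.trace(A), trace_from_eigs)
```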
UBC, ECONOMICS 627
2013 MIDTERM EXAMINATION
Suggested Solution
1. For part (a), when Wi is used as an instrument (for given B), we have an exactly
identified model. Therefore, using the variance formula for the IV estimator, we have

    V(B) = E(Wi Xi′)^{-1} E(U
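In the exactly identified case the IV estimator is β̂ = (W′X)^{-1}W′Y, and its heteroskedasticity-robust asymptotic variance has the sandwich form E(Wi Xi′)^{-1} E(Ui² Wi Wi′) E(Xi Wi′)^{-1}. A hedged numerical sketch with a scalar regressor and instrument (the DGP below is invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10000
beta = 2.0

# Illustrative DGP: scalar instrument W, regressor X endogenous through U.
W = rng.normal(size=n)
U = rng.normal(size=n)
X = 0.8 * W + 0.5 * U + rng.normal(size=n)
Y = beta * X + U

# Exactly identified IV estimator: beta_hat = (W'X)^{-1} W'Y.
beta_hat = (W @ Y) / (W @ X)

# Sandwich variance estimate: Avar = E(WX)^{-1} E(U^2 W^2) E(XW)^{-1},
# with expectations replaced by sample means and U by the IV residual.
u_hat = Y - beta_hat * X
q = (W @ X) / n
s = np.mean(u_hat**2 * W**2)
avar = s / q**2
se = np.sqrt(avar / n)
print(beta_hat, se)
```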
March 20, 2013
LECTURE 7
STATIONARITY, ERGODICITY, WEAK DEPENDENCE
The material is adapted from Peter Phillips' Lecture Notes on Stationary and Nonstationary Time Series and White (1999).
Often econometricians have to deal with data sets that come in the f
FEBRUARY 4, 2014
LECTURE 2
SIMULTANEOUS EQUATIONS I: DEFINITION, IDENTIFICATION, INDIRECT LS,
SINGLE-EQUATION GMM
Definition
We consider the following system of equations:

    Yi = BZi + Ui,    (1)
    E(Zi Ui) = 0,     (2)

where Yi is an m-vector of endogenous variables:
FEBRUARY 4, 2014
LECTURE 3
SIMULTANEOUS EQUATIONS II: MULTIPLE-EQUATION GMM, 3SLS.
In this lecture, we consider joint GMM estimation of more than one simultaneous equation. As we will
see, joint estimation can lead to efficiency gains.
Multiple-equation GMM
UBC, ECONOMICS 627
2012 MIDTERM EXAMINATION
Suggested Solutions
Question 1
Define M_V = I_n − V(V′V)^{-1}V′. Using the partitioned regression result, β̂ = (X′M_V X)^{-1} X′M_V Y.
Next,

    X′M_V X = X′X − X′V(V′V)^{-1}V′X
            = X′X − X′M_Z X (X′M_Z X)^{-1} X′M_Z X
            = X′X − X′M_Z X
            = X′P_Z X,

where
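The middle step is consistent with the choice V = M_Z X: then V′V = X′M_Z X and V′X = X′M_Z X (since M_Z is symmetric and idempotent), so X′V(V′V)^{-1}V′X collapses to X′M_Z X and the chain ends at X′P_Z X. A numerical check of the identity (the dimensions below are arbitrary; V = M_Z X is my reading of the derivation):

```python
import numpy as np

rng = np.random.default_rng(3)
n, k, p = 50, 3, 2
X = rng.normal(size=(n, k))
Z = rng.normal(size=(n, p))

def proj(A):
    """Projection matrix P_A = A (A'A)^{-1} A'."""
    return A @ np.linalg.solve(A.T @ A, A.T)

P_Z = proj(Z)
M_Z = np.eye(n) - P_Z
V = M_Z @ X                   # assumed definition of V, consistent with the steps above
M_V = np.eye(n) - proj(V)

lhs = X.T @ M_V @ X           # X' M_V X
rhs = X.T @ P_Z @ X           # X' P_Z X
print(np.allclose(lhs, rhs))
```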
UBC, Econ 627
2011 MIDTERM EXAMINATION
Suggested Solution
Question 1
(a) Assuming that θ0 lies in the interior of Θ, and because g is continuously differentiable
by A3, we can apply the mean-value theorem to g(θ̂n) in (1) to obtain
    o_p(n^{-1/2}) = g(θ0) + G(θ̄n)(θ̂n − θ0),
September 22, 2009
VECTOR AND MATRIX DIFFERENTIATION
1. Derivatives of Ax
Let a ∈ R^n, x ∈ R^n (all vectors are column vectors). Then

    ∂(a′x)/∂x = ( ∂(a′x)/∂x1, …, ∂(a′x)/∂xn )′
              = ( ∂(a1 x1 + ⋯ + an xn)/∂x1, …, ∂(a1 x1 + ⋯ + an xn)/∂xn )′
              = (a1, …, an)′ = a.
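The stacked derivative reduces to ∂(a′x)/∂x = a because ∂(a1 x1 + ⋯ + an xn)/∂xj = aj for each j. A finite-difference sketch confirming this (the vectors are arbitrary random draws):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5
a = rng.normal(size=n)
x = rng.normal(size=n)

f = lambda x: a @ x    # f(x) = a'x

# Central finite differences approximate each partial derivative of f;
# for a linear function they are exact up to floating-point error.
eps = 1e-6
grad = np.array([(f(x + eps * e) - f(x - eps * e)) / (2 * eps)
                 for e in np.eye(n)])
print(np.allclose(grad, a, atol=1e-6))
```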