Hints for Assignment 2
Ling Chen
November 5, 2008
2.9. You may assume that you can interchange integration with differentiation.
Key steps to prove part (a) are the following:
\[
E\left(\nabla \log f_\theta(X)\right)
= E\left[\frac{\nabla f_\theta(X)}{f_\theta(X)}\right]
= \int \frac{\nabla f_\theta(X)}{f_\theta(X)} \cdot f_\theta(X)\, dX
= \int \nabla f_\theta(X)\, dX
= \nabla \int f_\theta(X)\, dX
= \nabla (1) = 0.
\]
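As a quick sanity check (not part of the required proof), part (a) can be verified numerically for a simple one-parameter family, e.g. $f_\theta = N(\theta, 1)$, whose score is $\nabla_\theta \log f_\theta(x) = x - \theta$; the sample mean of the score over draws from $f_\theta$ should be close to 0:

```python
import numpy as np

# Sanity check of part (a) for f_theta = N(theta, 1):
# the score is d/dtheta log f_theta(x) = x - theta,
# so its expectation under f_theta should be 0.
rng = np.random.default_rng(0)
theta = 2.5
x = rng.normal(loc=theta, scale=1.0, size=200_000)

score = x - theta        # score evaluated at the true theta
mean_score = score.mean()
print(mean_score)        # near 0, up to Monte Carlo error ~ 1/sqrt(200000)
```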
Note that I have simplified the notation. For part (b), you need to start with
\[
\nabla^2 \ell(\theta)
= \nabla \frac{\nabla f_\theta(X)}{f_\theta(X)}
= \frac{f_\theta(X)\, \nabla^2 f_\theta(X) - \nabla f_\theta(X)\, (\nabla f_\theta(X))^T}{(f_\theta(X))^2},
\]
and then use the same interchange of differentiation and integration together with the result in part (a).
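If part (b) is the usual information identity $E[\nabla^2 \ell(\theta)] = -E[\nabla \ell(\theta)\, \nabla \ell(\theta)^T]$ (my assumption; check the problem statement), it too can be checked numerically for $N(\theta, 1)$, where $\nabla \log f_\theta(x) = x - \theta$ and $\nabla^2 \log f_\theta(x) = -1$ exactly:

```python
import numpy as np

# Numerical check of the (assumed) information identity
#   E[d^2/dtheta^2 log f_theta(X)] = -E[(d/dtheta log f_theta(X))^2]
# for f_theta = N(theta, 1): the score is x - theta and the second
# derivative of log f_theta is the constant -1.
rng = np.random.default_rng(1)
theta = -0.7
x = rng.normal(loc=theta, scale=1.0, size=200_000)

score = x - theta
lhs = -1.0                 # second derivative of log f, constant here
rhs = -np.mean(score**2)   # -E[score^2], estimated by Monte Carlo
print(lhs, rhs)            # the two sides should nearly agree
```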
3.2. Using the notation $Y = (y_1, \dots, y_n)^T$,
\[
X = \begin{pmatrix} 1 & \cdots & 1 \\ x_1 & \cdots & x_n \end{pmatrix}^T,
\]
$B = (\alpha, \beta)^T$ and $E = (\epsilon_1, \dots, \epsilon_n)^T$, we can write (3.24) as $Y = XB + E$, and the OLS estimate for $B$ is $\hat{B} = (X^T X)^{-1} X^T Y$.
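As an illustration on made-up data, $\hat{B} = (X^T X)^{-1} X^T Y$ can be computed directly and compared against numpy's least-squares solver:

```python
import numpy as np

# Simple-regression design X = [1, x]: compute B_hat = (X^T X)^{-1} X^T Y
# and compare with numpy's least-squares solver. The data are made up.
rng = np.random.default_rng(2)
n = 50
x = rng.uniform(0, 10, size=n)
y = 1.0 + 2.0 * x + rng.normal(size=n)      # true alpha = 1, beta = 2

X = np.column_stack([np.ones(n), x])        # n x 2 design matrix
B_hat = np.linalg.solve(X.T @ X, X.T @ y)   # normal equations, no explicit inverse
B_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)
print(B_hat, B_lstsq)                       # the two estimates agree
```

Solving the normal equations with `np.linalg.solve` avoids forming $(X^T X)^{-1}$ explicitly, which is both cheaper and numerically safer.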
You can prove (3.27) by following the arguments in Section 2.3.3 and using Definition 2.3 of the Wishart distribution. The only thing that might make the multivariate case more complicated is the following:
Lemma. Suppose $E = (\epsilon_1, \dots, \epsilon_n)^T$, in which the $\epsilon_t$ are i.i.d. $N(0, V)$, $W = AE$ and $Z = BE$, where $A$ and $B$ are $a \times n$ and $b \times n$ matrices. Then $w_i$, the $i$th row of $W$, and $z_j$, the $j$th row of $Z$, are jointly normal with mean 0 and $\mathrm{Cov}(w_i, z_j) = (AB^T)_{ij} V$, where $M_{ij}$ denotes the $(i, j)$th entry of a matrix $M$. In particular, $w_i \sim N(0, (AA^T)_{ii} V)$.
Proof. $w_i = \sum_{t=1}^n A_{it}\,\epsilon_t$ and $z_j = \sum_{t=1}^n B_{jt}\,\epsilon_t$ are linear combinations of the $\epsilon_t$. This verifies that $w_i$ and $z_j$ are jointly normal with mean 0. Furthermore,
\[
\mathrm{Cov}(w_i, z_j)
= \mathrm{Cov}\Big(\sum_{t=1}^n A_{it}\,\epsilon_t,\ \sum_{t=1}^n B_{jt}\,\epsilon_t\Big)
= \sum_{s=1}^n \sum_{t=1}^n A_{is} B_{jt}\, \mathrm{Cov}(\epsilon_s, \epsilon_t)
= \sum_{t=1}^n A_{it} B_{jt}\, \mathrm{Cov}(\epsilon_t, \epsilon_t)
= \Big(\sum_{t=1}^n A_{it} B_{jt}\Big) V
= (AB^T)_{ij}\, V.
\]
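The collapse of the double sum in the proof (only the $s = t$ terms survive, since $\mathrm{Cov}(\epsilon_s, \epsilon_t) = 0$ for $s \neq t$) can be checked directly on arbitrary example matrices:

```python
import numpy as np

# Check the covariance computation in the proof: with Cov(eps_s, eps_t)
# equal to 0 for s != t and V for s = t, the double sum
#   sum_s sum_t A_is B_jt Cov(eps_s, eps_t)
# collapses to (A B^T)_{ij} V. The matrices here are arbitrary examples.
rng = np.random.default_rng(3)
a, b, n, p = 3, 4, 6, 2
A = rng.normal(size=(a, n))
B = rng.normal(size=(b, n))
V = np.array([[2.0, 0.5], [0.5, 1.0]])   # a p x p covariance matrix

i, j = 1, 2
double_sum = sum(
    A[i, s] * B[j, t] * (V if s == t else np.zeros((p, p)))
    for s in range(n) for t in range(n)
)
closed_form = (A @ B.T)[i, j] * V
print(np.allclose(double_sum, closed_form))  # True
```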