Thus $v(y) = y^{-2(\lambda-1)}$. From Exercise 11.1,
\[
\operatorname{Var}\!\left(\frac{Y^{\lambda}-1}{\lambda}\right)
\approx \left[\frac{d}{d\theta}\,\frac{\theta^{\lambda}-1}{\lambda}\right]^{2} v(\theta)
= \theta^{2(\lambda-1)}\,\theta^{-2(\lambda-1)} = 1.
\]
Note: If $\lambda = 1/2$, $v(\theta) = \theta$, which agrees with Exercise 11.2(a). If $\lambda = 0$ (the log transformation), $v(\theta) = \theta^{2}$, which agrees with Exercise 11.2(c).
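As a quick numerical illustration of the $\lambda = 1/2$ case, the following Python sketch simulates Poisson data (so that $v(\theta) = \theta$) and estimates $\operatorname{Var}\bigl((Y^{\lambda}-1)/\lambda\bigr)$ over a grid of means; the grid and sample size are arbitrary choices for illustration, not part of the exercise.

```python
import numpy as np

# Delta-method check: for Poisson Y (variance = mean), lambda = 1/2 should
# make Var((Y^lambda - 1)/lambda) approximately constant (equal to 1).
# The means and the sample size below are arbitrary illustrative choices.
rng = np.random.default_rng(0)
lam = 0.5
for theta in (5, 20, 80):
    y = rng.poisson(theta, size=200_000)
    g = (y**lam - 1) / lam          # power transformation with lambda = 1/2
    print(theta, g.var())           # each estimated variance is close to 1
```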
11.5 For the model
\[
Y_{ij} = \mu + \tau_i + \epsilon_{ij}, \qquad i = 1, \ldots, k, \quad j = 1, \ldots, n_i,
\]
take $k = 2$. The two parameter configurations
\[
(\mu, \tau_1, \tau_2) = (10,\, 5,\, 2) \qquad\text{and}\qquad (\mu, \tau_1, \tau_2) = (7,\, 8,\, 5)
\]
have the same values for $\mu + \tau_1$ and $\mu + \tau_2$, so they give the same distributions for $Y_1$ and $Y_2$.
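Concretely, $10 + 5 = 7 + 8 = 15$ and $10 + 2 = 7 + 5 = 12$, so both configurations give the same two cell means; with the same error distribution, the data distributions coincide and $(\mu, \tau_1, \tau_2)$ cannot be identified from the data.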
11.6 a. Under the ANOVA assumptions $Y_{ij} = \theta_i + \epsilon_{ij}$, where $\epsilon_{ij} \sim$ independent $\mathrm{n}(0, \sigma^2)$, so $Y_{ij} \sim$ independent $\mathrm{n}(\theta_i, \sigma^2)$. Therefore the sample pdf is
\[
\prod_{i=1}^{k}\prod_{j=1}^{n_i} (2\pi\sigma^2)^{-1/2}\, e^{-(y_{ij}-\theta_i)^2/(2\sigma^2)}
= (2\pi\sigma^2)^{-\sum n_i/2} \exp\!\left(-\frac{1}{2\sigma^2}\sum_{i=1}^{k}\sum_{j=1}^{n_i}(y_{ij}-\theta_i)^2\right)
\]
\[
= (2\pi\sigma^2)^{-\sum n_i/2} \exp\!\left(-\frac{1}{2\sigma^2}\sum_{i=1}^{k} n_i\theta_i^2\right)
\times \exp\!\left(-\frac{1}{2\sigma^2}\sum_i\sum_j y_{ij}^2 + \frac{2}{2\sigma^2}\sum_{i=1}^{k}\theta_i n_i \bar{Y}_{i\cdot}\right).
\]
Therefore, by the Factorization Theorem, $\bigl(\bar{Y}_{1\cdot}, \bar{Y}_{2\cdot}, \ldots, \bar{Y}_{k\cdot}, \sum_i\sum_j Y_{ij}^2\bigr)$ is jointly sufficient for $(\theta_1, \ldots, \theta_k, \sigma^2)$. Since $(\bar{Y}_{1\cdot}, \ldots, \bar{Y}_{k\cdot}, S_p^2)$ is a 1-to-1 function of this vector, $(\bar{Y}_{1\cdot}, \ldots, \bar{Y}_{k\cdot}, S_p^2)$ is also jointly sufficient.
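The following Python sketch computes these statistics on made-up data; the group values are arbitrary, and $S_p^2$ is taken to be the usual pooled variance $\sum_i\sum_j(y_{ij}-\bar y_{i\cdot})^2/(\sum_i n_i - k)$, which is assumed to match the textbook's definition. The final check illustrates the 1-to-1 relation between $\sum_i\sum_j Y_{ij}^2$ and $(S_p^2, \bar Y_{1\cdot}, \ldots, \bar Y_{k\cdot})$.

```python
import numpy as np

# Sketch of the jointly sufficient statistics in part (a) on hypothetical data.
groups = [np.array([9.8, 10.3, 11.1]),          # hypothetical group 1
          np.array([14.2, 15.0, 14.7, 15.4])]   # hypothetical group 2
k, N = len(groups), sum(len(g) for g in groups)
n = np.array([len(g) for g in groups])

ybar = np.array([g.mean() for g in groups])            # Ybar_1., ..., Ybar_k.
sum_sq = sum((g**2).sum() for g in groups)             # sum_ij Y_ij^2
ssw = sum(((g - g.mean())**2).sum() for g in groups)   # within-group SS
s2_pooled = ssw / (N - k)                              # S_p^2 (assumed form)

# 1-to-1 relation used in the last step of part (a):
#   sum_ij Y_ij^2 = SSW + sum_i n_i * Ybar_i.^2
print(ybar, s2_pooled)
print(np.isclose(sum_sq, ssw + (n * ybar**2).sum()))   # True
```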
b. We can write
\[
(2\pi\sigma^2)^{-\sum n_i/2} \exp\!\left(-\frac{1}{2\sigma^2}\sum_{i=1}^{k}\sum_{j=1}^{n_i}(y_{ij}-\theta_i)^2\right)
= (2\pi\sigma^2)^{-\sum n_i/2} \exp\!\left(-\frac{1}{2\sigma^2}\sum_{i=1}^{k}\sum_{j=1}^{n_i}\bigl([y_{ij}-\bar{y}_{i\cdot}] + [\bar{y}_{i\cdot}-\theta_i]\bigr)^2\right)
\]
\[
= (2\pi\sigma^2)^{-\sum n_i/2} \exp\!\left(-\frac{1}{2\sigma^2}\sum_{i=1}^{k}\sum_{j=1}^{n_i}[y_{ij}-\bar{y}_{i\cdot}]^2\right)
\exp\!\left(-\frac{1}{2\sigma^2}\sum_{i=1}^{k} n_i[\bar{y}_{i\cdot}-\theta_i]^2\right),
\]
where the cross terms vanish because $\sum_{j=1}^{n_i}(y_{ij}-\bar{y}_{i\cdot}) = 0$ for each $i$. So, by the Factorization Theorem, $\bar{Y}_{i\cdot}$, $i = 1, \ldots, k$, is independent of $Y_{ij} - \bar{Y}_{i\cdot}$, $j = 1, \ldots, n_i$, so $S_p^2$ is independent of each $\bar{Y}_{i\cdot}$.
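A quick Monte Carlo sketch in Python of this independence, with arbitrary hypothetical parameter values: the sample correlation between $\bar Y_{1\cdot}$ and $S_p^2$ should be near zero. Zero correlation is only a consequence of independence, so this is a consistency check, not a proof.

```python
import numpy as np

# Monte Carlo consistency check of part (b): under the normal ANOVA model the
# group means Ybar_i. and the pooled variance S_p^2 are independent, so their
# sample correlation should be near zero.  All parameter values are arbitrary.
rng = np.random.default_rng(1)
theta, sigma, n = (15.0, 12.0), 2.0, (5, 8)
reps = 20_000
mean1 = np.empty(reps)   # Ybar_1. in each replication
s2p = np.empty(reps)     # S_p^2  in each replication
for r in range(reps):
    groups = [rng.normal(theta[i], sigma, n[i]) for i in range(len(n))]
    mean1[r] = groups[0].mean()
    ssw = sum(((g - g.mean())**2).sum() for g in groups)
    s2p[r] = ssw / (sum(n) - len(n))
print(np.corrcoef(mean1, s2p)[0, 1])   # close to 0
```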
c. Just identify $n_i\bar{Y}_{i\cdot}$ with $X_i$ and redefine $\theta_i$ as $n_i\theta_i$.
