Solutions to the Exercises of Section 4.7.
4.7.1. If the coin comes up heads, then d2 is minimax. It guarantees the statistician a loss of at most 1, and nature, by choosing θ = 1, can guarantee the statistician's loss to be at least 1. Similarly, if
Solutions to the Exercises of Section 1.7.
1.7.1(a) The minimax point is the intersection of the line joining (−2, 3) to (−3/4, −9/4) and the line
y = x. The former line has slope −21/5 and equation
y − 3 = −(21/5)(x + 2).
Putting y = x and solving yields x = y = −27/26.
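A quick check of the intersection, using the slope −21/5 through (−2, 3) as reconstructed above (exact rational arithmetic, so no rounding is involved):

```python
from fractions import Fraction

# line through (-2, 3) with slope -21/5:  y - 3 = -(21/5)(x + 2)
slope = Fraction(-21, 5)

def y_on_line(x):
    return 3 + slope * (x + 2)

# intersect with y = x:  x - 3 = -(21/5)(x + 2)  =>  26x = -27
x = Fraction(-27, 26)
assert y_on_line(x) == x  # the point lies on both lines
print(x)  # -27/26
```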
Solutions to the Exercises of Section 1.8.
1.8.1. E(Z − b)² = Var(Z) + (EZ − b)² obviously takes on its minimum value of Var(Z) when b = EZ.
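The identity and its minimizer can be checked on any sample (a sketch; the normal sample and grid are illustrative, and the empirical measure plays the role of the distribution of Z):

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(3.0, 2.0, size=10_000)  # any sample stands in for Z

def mse(b):
    # empirical E(Z - b)^2
    return np.mean((z - b) ** 2)

# identity: E(Z - b)^2 = Var(Z) + (EZ - b)^2 (exact for the empirical measure)
for b in (-1.0, 0.0, 2.5, 7.0):
    assert np.isclose(mse(b), z.var() + (z.mean() - b) ** 2)

# hence the minimizing b is EZ (here, the sample mean)
grid = np.linspace(-5.0, 10.0, 3001)
best = grid[np.argmin([mse(b) for b in grid])]
assert abs(best - z.mean()) < 0.01  # within one grid step of the mean
```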
1.8.2. We say b0 is a median of a random variable Z if P(Z ≤ b0) ≥ 1/2 and P(Z ≥ b0) ≥ 1/2. Let b0
be any median of Z. F
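The solution continues beyond this excerpt; presumably it goes on to show that any median minimizes E|Z − b|. A numeric check of that claim (the exponential sample is illustrative, not from the text):

```python
import numpy as np

rng = np.random.default_rng(1)
z = rng.exponential(2.0, size=9_999)  # odd size: the sample median is a data point
b0 = np.median(z)

# b0 satisfies the median definition: P(Z <= b0) >= 1/2 and P(Z >= b0) >= 1/2
assert np.mean(z <= b0) >= 0.5 and np.mean(z >= b0) >= 0.5

def mad(b):
    # empirical E|Z - b|
    return np.mean(np.abs(z - b))

# E|Z - b| is convex piecewise linear with its minimum at the median
for b in (b0 - 1.0, b0 - 0.1, b0 + 0.1, b0 + 1.0):
    assert mad(b0) <= mad(b)
```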
Solution to Exercises 7.7.12 through 7.7.15.
7.7.12. (a) Let Z1, Z2, . . . be i.i.d. with M(t) = E e^{tZ}, and suppose for some 0 < ρ < 1 that ρM(t1) =
ρM(t2) = 1 for some t1 ≠ t2. Since M is convex and M(0) = 1, t1 and t2 are of opposite signs, so we
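For a concrete instance of the convexity argument (my example, not from the text): take Z = ±1 with probability 1/2 each, so M(t) = cosh t, which is convex with M(0) = 1; any level 1/ρ > 1 is attained at exactly two points, and they have opposite signs.

```python
from math import cosh, acosh

# Z = +-1 with probability 1/2 each: M(t) = E e^{tZ} = cosh(t),
# convex with M(0) = 1.  For rho in (0, 1), the level M(t) = 1/rho > 1
# is attained at exactly two points t1 < 0 < t2.
rho = 0.4                  # illustrative value
level = 1.0 / rho
t2 = acosh(level)
t1 = -t2
assert t1 < 0 < t2         # the two roots have opposite signs
assert abs(cosh(t1) - level) < 1e-12 and abs(cosh(t2) - level) < 1e-12
```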
Solutions to Exercises 7.2.2 to 7.2.4, and 7.2.6 to 7.2.9.
7.2.2. At stage 0, the Bayes expected loss is ∫₀¹ (θ − a)²/(θ(1 − θ)) dθ = ∞ for all a, so that
ρ0 = ∞. At stage j, the posterior distribution of θ given X1 = 1, . . . , Xj = 1 is still U(0, 1), so
ρj (1, . .
Solution to Exercises 6.3.3 through 6.3.5.
6.3.3. (a) The joint density of Y1, . . . , Yk−1 under Hi for i ≠ 0 is given by (6.22). Let Z1 = |Y1|, Z2 =
Y2 sgn Y1, . . . , Zk−1 = Yk−1 sgn Y1. This is a two-to-one map and the density of Z1, . . . , Zk−1 is
Solutions to the Exercises of Section 6.2.
6.2.1. First we show the hint: for z > 0, 1 − z + log z ≤ 0 (note the inequality is backward in the text).
Let g(z) = 1 − z + log z. Then g′(z) = −1 + (1/z) and g″(z) = −1/z². Thus, g(z) reaches its maximum
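A numeric check of the hint (the grid is illustrative): g′(z) = −1 + 1/z vanishes only at z = 1 and g″ < 0, so g attains its maximum g(1) = 0, giving 1 − z + log z ≤ 0 for z > 0.

```python
import numpy as np

def g(z):
    # the hint's function: g(z) = 1 - z + log z
    return 1.0 - z + np.log(z)

z = np.linspace(1e-6, 50.0, 200_001)
assert g(1.0) == 0.0                      # the maximum value is exactly 0
assert np.all(g(z) <= 1e-12)              # 1 - z + log z <= 0 for z > 0
assert abs(z[np.argmax(g(z))] - 1.0) < 1e-3  # maximizer sits at z = 1
```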
Solutions to Exercises 6.1.1 through 6.1.3.
6.1.1. f(x|θ) = (5 choose x) θ^x (1 − θ)^{5−x} has monotone likelihood ratio, so, if the loss function satisfies (6.3)
with θ1 = 1/3 and θ2 = 2/3, we want to find one-sided φj's such that Eθ1 φ1(X) = Eθ1 φ(X) and
Eθ2 φ2(X) = Eθ2 φ
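The MLR claim for the binomial(5, θ) family can be verified directly (a small sketch with θ1 = 1/3 and θ2 = 2/3 as in the exercise):

```python
from math import comb

# f(x | theta) = C(5, x) theta^x (1 - theta)^(5 - x), theta1 = 1/3 < theta2 = 2/3
n, th1, th2 = 5, 1 / 3, 2 / 3

def f(x, th):
    return comb(n, x) * th ** x * (1 - th) ** (n - x)

# the likelihood ratio f(x|th2)/f(x|th1) = 2**(2x - 5) is increasing in x
ratios = [f(x, th2) / f(x, th1) for x in range(n + 1)]
assert all(a < b for a, b in zip(ratios, ratios[1:]))
```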
Solutions to Exercises 5.10.1 through 5.10.6.
5.10.1. Let x ∈ Er, where Er is defined with c0 ≥ 0, and let Σ > 0.
(a) If. Putting a = Σ⁻¹(x − μ) into the right side gives ((x − μ)ᵀΣ⁻¹(x − μ))² ≤ c0 (x − μ)ᵀΣ⁻¹(x − μ).
If (x − μ)ᵀΣ⁻¹(x − μ) ≠ 0, then this may be cancelled from both sides, resulting in the desired inequality.
Solutions to Exercises 5.9.3 through 5.9.9.
5.9.3.
Σij (Xij − μ − αi − βj)²
  = Σij [(Xij − Xi. − X.j + X..) + (Xi. − X.. − αi) + (X.j − X.. − βj) + (X.. − μ)]²
  = Σij (Xij − Xi. − X.j + X..)² + Σij (Xi. − X.. − αi)² + Σij (X.j − X.. − βj)² + Σij (X.. − μ)²
    + six cross product terms.
We must show that all cross product terms
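As a numeric sanity check of the decomposition (a sketch in Python; the table dimensions and parameter values are illustrative, and the usual side conditions Σαi = Σβj = 0 are assumed), the six cross-product terms cancel and the identity holds exactly:

```python
import numpy as np

rng = np.random.default_rng(2)
I, J = 4, 6                        # illustrative table dimensions
mu = 1.5                           # illustrative parameter value
alpha = rng.normal(size=I); alpha -= alpha.mean()  # side condition sum(alpha)=0
beta = rng.normal(size=J);  beta -= beta.mean()    # side condition sum(beta)=0
X = rng.normal(size=(I, J))

Xi = X.mean(axis=1, keepdims=True)   # row means    Xbar_i.
Xj = X.mean(axis=0, keepdims=True)   # column means Xbar_.j
Xg = X.mean()                        # grand mean   Xbar_..

lhs = np.sum((X - mu - alpha[:, None] - beta[None, :]) ** 2)
t1 = np.sum((X - Xi - Xj + Xg) ** 2)                # interaction residual
t2 = J * np.sum((Xi.ravel() - Xg - alpha) ** 2)     # row-effect term
t3 = I * np.sum((Xj.ravel() - Xg - beta) ** 2)      # column-effect term
t4 = I * J * (Xg - mu) ** 2                         # grand-mean term

# all six cross-product terms sum to zero, so the decomposition is exact
assert np.isclose(lhs, t1 + t2 + t3 + t4)
```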
Solutions to Exercises 5.8.2 through 5.8.4 and 5.8.7.
5.8.2. Let φ(x) denote the density of the standard normal distribution and let θ1 < θ2. The likelihood
ratio is
f(x|θ2)/f(x|θ1) =
  φ(x − θ2 + 1)/φ(x − θ1 + 1)   if θ1 < θ2 < 0,
  φ(x − θ2)/φ(x − θ1 + 1)       if θ1 < θ2 = 0,
  φ(x − θ2 − 1)/φ(x − θ1 +
Solutions to the Exercises of Section 5.7.
5.7.1. If X1, . . . , XN are i.i.d. from the density f(x|θ) = e^{x−θ} I(−∞,θ)(x), and if V(1) < V(2) < · · · < V(N)
denote the order statistics, then the joint density of V(1), . . . , V(N) is
fV(1),...,V(N)(v1, . . . , vN |θ)
Solutions to Exercises 5.6.1 through 5.6.8, and 5.6.12.
5.6.1. This problem is invariant under a change of location, gc(x1, x2) = (x1 + c, x2 + c), and a maximal
invariant is Y = X1 − X2. Under H0, Y has a N(0, 2) distribution with density
f0(y) = (4π)^{−1/2} e^{−y²/4}.
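A numeric check that the N(0, 2) density has the right normalization and variance (a sketch; the integration grid is illustrative):

```python
import numpy as np

# f0 is the N(0, 2) density: f0(y) = (4*pi)**(-1/2) * exp(-y**2 / 4)
y = np.linspace(-30.0, 30.0, 600_001)
dy = y[1] - y[0]
f0 = (4 * np.pi) ** -0.5 * np.exp(-y ** 2 / 4)

assert abs(np.sum(f0) * dy - 1.0) < 1e-6            # total mass 1
assert abs(np.sum(y ** 2 * f0) * dy - 2.0) < 1e-4   # variance 2
```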
Solutions of Exercises 5.5.1 to 5.5.3.
5.5.1. (a) If X ∈ C(0, 1), and U = 2X/(1 + X²), then EU = 0 since |U| ≤ 1 and U has a symmetric distribution about 0. To evaluate EU², we make the change of variable, θ = arctan(x) with dx = (1/cos²θ) dθ. Then 2x/(1 + x²) = sin 2θ, and
EU² = (1/π) ∫_{−π/2}^{π/2} sin²(2θ) dθ = 1/2.
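A quadrature check of this change of variable (a sketch; the grid size is arbitrary): with x = tan θ the C(0, 1) density dx/(π(1 + x²)) becomes dθ/π on (−π/2, π/2), U becomes sin 2θ, and the integral evaluates to 1/2.

```python
import numpy as np

# theta = arctan(x): the C(0,1) density dx/(pi(1 + x^2)) becomes d(theta)/pi
# on (-pi/2, pi/2), and U = 2x/(1 + x^2) = sin(2*theta).
theta = np.linspace(-np.pi / 2, np.pi / 2, 2_000_001)
dt = theta[1] - theta[0]
eu2 = np.sum(np.sin(2 * theta) ** 2) * dt / np.pi

assert abs(eu2 - 0.5) < 1e-5   # EU^2 = 1/2
```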
Solutions to the Exercises of Section 5.4.
5.4.1. Suppose φ is an unbiased test of size α that is admissible within the class of unbiased tests.
We are to show that φ is admissible. Suppose not. Then there is a test ψ that is better than φ; that
is, Eθ ψ(X) ≤ Eθ φ(X)
Solutions to the Exercises of Section 4.8.
4.8.1. We have ḡ(F) = F(g⁻¹) and ḡ(F̂) = F̂(g⁻¹). We are to show L(ḡ(F), ḡ(F̂)) = L(F, F̂).
For the loss of (4.63),
L(ḡ(F), ḡ(F̂)) = sup_x |F(g⁻¹(x)) − F̂(g⁻¹(x))| = sup_y |F(y) − F̂(y)| = L(F, F̂)
since g⁻¹ is onto.
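The invariance can be checked numerically. Below is a sketch with illustrative choices of F, F̂, and a monotone g (all my assumptions, not from the text): the sup-distance computed before and after composing with g⁻¹ agrees, because substituting y = g⁻¹(x) sweeps the same set of values.

```python
import numpy as np

def F(y):    return 1.0 / (1.0 + np.exp(-y))        # a cdf (logistic)
def Fhat(y): return 0.5 * (1.0 + np.tanh(y / 1.7))  # a competing cdf estimate
def g(y):    return y ** 3 + y                       # strictly increasing, onto

y = np.linspace(-10.0, 10.0, 200_001)
d_orig = np.max(np.abs(F(y) - Fhat(y)))

# transformed pair: gbar(F)(x) = F(g^{-1}(x)); invert g by interpolation,
# which is valid because g(y) is strictly increasing on the grid
x = np.linspace(g(-10.0), g(10.0), 200_001)
ginv = np.interp(x, g(y), y)
d_trans = np.max(np.abs(F(ginv) - Fhat(ginv)))

assert abs(d_orig - d_trans) < 1e-3   # sup|F - Fhat| is unchanged
```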
Solutions to the Exercises of Section 1.4.
1.4.1. Proof. (i) From linearity of ≾, either p ≾ p or p ≿ p. Thus, p ≾ p and p ≿ p.
(ii) If p1 ≾ p2 and p2 ≾ p1, then p2 ≾ p1 and p1 ≾ p2. So p1 ∼ p2 implies p2 ∼ p1.
(iii) If p1 ≾ p2 and p2 ≾ p1, and if p2 ≾ p3 and p3 ≾ p2, then from