$$\hat{c}_{11} = \frac{1}{n-1}\sum_{i=1}^{n}\left(X_i - \bar{X}(n)\right)\left(Y_i - \bar{Y}(n)\right),$$
where $\{(X_i, Y_i),\ i = 1, \ldots, n\}$ are paired observations, is an unbiased, consistent estimator for $c_{11}$.
(a) Unbiasedness: Start with
$$\hat{c}_{11} = \frac{1}{n-1}\sum_{i=1}^{n}\left[X_i Y_i + \bar{X}(n)\bar{Y}(n) - X_i\bar{Y}(n) - Y_i\bar{X}(n)\right]$$
and observe that
$$E[\hat{c}_{11}] = \frac{1}{n-1}\sum_{i=1}^{n}\cdots$$
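As a quick numerical check (not part of the original solution), the defining form and the expanded form of the sample covariance used above can be compared on a small data set; the data values here are arbitrary:

```python
# Check that the expanded form of the sample covariance matches the
# defining form; the two should agree for any paired data.
xs = [1.0, 2.0, 4.0, 7.0]
ys = [2.0, 1.0, 5.0, 6.0]
n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n

# Defining form: (1/(n-1)) * sum (Xi - Xbar)(Yi - Ybar)
c_def = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / (n - 1)

# Expanded form: (1/(n-1)) * sum [Xi*Yi + Xbar*Ybar - Xi*Ybar - Yi*Xbar]
c_exp = sum(x * y + xbar * ybar - x * ybar - y * xbar
            for x, y in zip(xs, ys)) / (n - 1)

assert abs(c_def - c_exp) < 1e-12
```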
8. (a) First we define the dummy RV $W = X + Y$; then we have $X = ZW$ and $Y = (1 - Z)W$. Then $(x, y) = (zw, (1 - z)w)$ for $0 \le z \le 1$, $w \ge 0$, and the Jacobian is given as
$$J = \det\begin{bmatrix} \partial x/\partial z & \partial x/\partial w \\ \partial y/\partial z & \partial y/\partial w \end{bmatrix} = \det\begin{bmatrix} w & z \\ -w & 1-z \end{bmatrix} = w.$$
Since $X$ and $Y$ are independent,
$$f_{XY}(x, y) = f_X(x)\,f_Y(y) = \lambda^2 e^{-\lambda(x+y)}$$
(by the problem statement). So $f_{ZW}(z, w) = \lambda^2 w\, e^{-\lambda w}$. Then integrating out the v
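A hedged numerical check of the Jacobian step: assuming the substitution is $x = zw$, $y = (1-z)w$ (a common choice for this kind of ratio problem, and my reading of the garbled original), a finite-difference approximation of the Jacobian determinant should equal $w$:

```python
# Finite-difference check that det d(x,y)/d(z,w) = w for the assumed
# substitution x = z*w, y = (1 - z)*w.
def xy(z, w):
    return z * w, (1.0 - z) * w

def jacobian_det(z, w, h=1e-6):
    # Central differences for the 2x2 Jacobian matrix d(x,y)/d(z,w).
    dxdz = (xy(z + h, w)[0] - xy(z - h, w)[0]) / (2 * h)
    dxdw = (xy(z, w + h)[0] - xy(z, w - h)[0]) / (2 * h)
    dydz = (xy(z + h, w)[1] - xy(z - h, w)[1]) / (2 * h)
    dydw = (xy(z, w + h)[1] - xy(z, w - h)[1]) / (2 * h)
    return dxdz * dydw - dxdw * dydz

assert abs(jacobian_det(0.3, 2.5) - 2.5) < 1e-6
```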
Now via the residue method, evaluating the complex integral
$$I = \frac{1}{2\pi j}\oint (\cdots)\, z^{n-1}\, dz$$
(see Appendix A.3, discrete time),
$$R[n] = \cdots$$
The answer is zero for $n < 0$, so the full answer is
$$R[n] = \begin{cases} \cdots, & n \ge 0, \\ 0, & n < 0. \end{cases}$$
Alternatively, we can use th
Thus by the definition, the random sequence $X[n]$ is WSS.
(b) Consider the third moment function
$$E[X^3[n]] = E[(A\cos\omega n + B\sin\omega n)^3]$$
$$= E[A^3]\cos^3\omega n + 3E[A^2 B]\cos^2\omega n\,\sin\omega n + 3E[A B^2]\cos\omega n\,\sin^2\omega n + E[B^3]\sin^3\omega n$$
$$= E[A^3]\cos^3\omega n + 3E[A^2]E[B]\cos^2\omega n\,\sin\omega n + 3E[A]E[B^2]\cos\omega n\,\sin^2\omega n + E[B^3]\sin^3\omega n$$
$$= E[A^3]\left(\cos^3\omega n + \sin^3\omega n\right),$$
using the independence of $A$ and $B$, $E[A] = E[B] = 0$, and $E[A^3] = E[B^3]$.
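A small numeric illustration (with $\omega = \pi/4$ and $E[A^3] = 1$ chosen only for concreteness): the third-moment function $E[A^3](\cos^3\omega n + \sin^3\omega n)$ varies with $n$, so the sequence, though WSS, is not third-order stationary:

```python
import math

# Third-moment function m3(n) = E[A^3] * (cos^3(w n) + sin^3(w n)),
# with illustrative values w = pi/4 and E[A^3] = 1.
omega = math.pi / 4

def m3(n, EA3=1.0):
    c, s = math.cos(omega * n), math.sin(omega * n)
    return EA3 * (c ** 3 + s ** 3)

assert abs(m3(0) - 1.0) < 1e-12        # cos^3(0) + sin^3(0) = 1
assert abs(m3(1) - 2 ** -0.5) < 1e-12  # 2 * (1/sqrt(2))^3 = 1/sqrt(2)
assert abs(m3(0) - m3(1)) > 0.1        # the third moment depends on n
```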
(c) For the covariance function of $Y$ we have
$$K_Y[m, n] = \sum\sum \cdots = C(m - n),$$
where $C(k) = E[\,\cdots\,]$, the WSS covariance function. Continuing on,
$$\cdots = \sum_{k=\max(0,\,\cdots)} \cdots$$
51.
$$Y[n] = Y[n-1] + X[n]$$
$$= Y[n-2] + X[n-1] + X[n]$$
$$\vdots$$
$$= Y[0] + X[1] + X[2] + \cdots + X[n] = \sum_{k=1}^{n} X[k] \quad (\text{with } Y[0] = 0),$$
a sum of independent RVs.
(a) For the mean,
$$\mu[n] \triangleq E[Y[n]] = E\left[\sum_{k=1}^{n} X[k]\right] = \sum_{k=1}^{n} E[X[k]] = \sum_{k=1}^{n} \mu = n\mu.$$
For the variance,
$$\sigma^2[n] \triangleq \operatorname{Var}[Y[n]] = \operatorname{Var}\left[\sum_{k=1}^{n} X[k]\right] = \sum_{k=1}^{n} \operatorname{Var}[X[k]] = n\sigma^2,$$
since the $X[k]$ are independent.
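The $n\mu$ and $n\sigma^2$ results can be verified exactly on a small example (not from the text): take i.i.d. steps $X[k] \in \{0, 1\}$ with probability $1/2$ each, so $\mu = 1/2$, $\sigma^2 = 1/4$, and enumerate every equally likely path:

```python
from itertools import product

# Exact enumeration check for Y[n] = sum of n i.i.d. Bernoulli(1/2) steps:
# E[Y[n]] should be n*mu = n/2 and Var[Y[n]] should be n*sigma^2 = n/4.
n = 6
paths = list(product([0, 1], repeat=n))   # all 2^n equally likely paths
vals = [sum(p) for p in paths]
mean = sum(vals) / len(vals)
var = sum((v - mean) ** 2 for v in vals) / len(vals)

assert abs(mean - n * 0.5) < 1e-12
assert abs(var - n * 0.25) < 1e-12
```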
43. We will use the simplified notation here.
$$P(X_n \mid X_{n+1}, \ldots, X_{100}) = \frac{P(X_n, X_{n+1}, \ldots, X_{100})}{P(X_{n+1}, \ldots, X_{100})} \qquad (6)$$
Now, by the chain rule of probability theory, the numerator of Equation 6 is equal to
$$P(X_n)\, P(X_{n+1} \mid X_n)\, P(X_{n+2} \mid X_{n+1}, X_n)\, P(X_{n+3} \mid X_{n+2}, X_{n+1}, X_n) \cdots P(X_{100} \mid X_{99}, X_{98}, \ldots)$$
$$= P(X_n) \prod_{k=n}^{99} P(X_{k+1} \mid X_k) \quad \text{by the Markov property.}$$
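The collapse of the chain-rule product under the Markov property can be checked explicitly on a hypothetical two-state chain (the initial distribution and transition matrix below are made up for illustration): the marginalized joint of a few consecutive states must equal the factored form.

```python
from itertools import product

# Hypothetical 2-state Markov chain.
P0 = [0.6, 0.4]                   # initial distribution (illustrative)
T = [[0.7, 0.3], [0.2, 0.8]]      # transition matrix (illustrative)

def joint(path):
    """P(X_0 = path[0], ..., X_m = path[m]) via the chain rule."""
    p = P0[path[0]]
    for a, b in zip(path, path[1:]):
        p *= T[a][b]
    return p

def marginal2(x2):
    """P(X_2 = x2), summing out X_0 and X_1."""
    return sum(joint((a, b, x2)) for a, b in product(range(2), repeat=2))

# Joint of (X_2, X_3, X_4) by brute-force marginalization must equal the
# Markov factorization P(x2) * P(x3|x2) * P(x4|x3).
for x2, x3, x4 in product(range(2), repeat=3):
    lhs = sum(joint((a, b, x2, x3, x4))
              for a, b in product(range(2), repeat=2))
    rhs = marginal2(x2) * T[x2][x3] * T[x3][x4]
    assert abs(lhs - rhs) < 1e-12
```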
But there is a general property of conditional expectation: for two random vectors $\mathbf{Y}$ and $\mathbf{Z}$ satisfying $\mathbf{Z} = g(\mathbf{Y})$,
$$E\big[\,E[X \mid \mathbf{Y}] \mid \mathbf{Z}\,\big] = E[X \mid \mathbf{Z}].$$
Applying this general result here, we take $\mathbf{Z} = (X[n], \ldots, X[0])$ and $\mathbf{Y} = (X[n+m], \ldots, X[n+1], \ldots, X[0])$. Then, we get
$$E\big[X[n+m+1] \mid X[n], \ldots, X[0]\big] = E\big[\cdots$$
$\bar{X}(n) = (1/n)\sum_{i=1}^{n} X_i$. Squared-error consistency requires that $\lim_{n\to\infty} E[(\bar{X}(n) - \mu_X)^2] = 0$. Expanding and taking expectations before taking limits yields
$$E[(\bar{X}(n) - \mu_X)^2] = E[\bar{X}^2(n)] - 2\mu_X E[\bar{X}(n)] + \mu_X^2 = \left(\frac{1}{n}\sigma_X^2 + \mu_X^2\right) - 2\mu_X^2 + \mu_X^2 = \frac{1}{n}\sigma_X^2.$$
Clearly for a finite $\sigma_X^2$ this tends to zero as $n \to \infty$, so $\bar{X}(n)$ is squared-error consistent.
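The $\sigma_X^2/n$ result can be confirmed exactly (not just asymptotically) on a small example: for i.i.d. Bernoulli($1/2$) samples ($\mu_X = 1/2$, $\sigma_X^2 = 1/4$), enumerate every sample of size $n$ and compute the mean squared error of $\bar{X}(n)$ directly:

```python
from itertools import product

# Exact MSE of the sample mean over all 2^n equally likely Bernoulli(1/2)
# samples; should equal sigma^2 / n = 0.25 / n.
n = 5
mu, sigma2 = 0.5, 0.25
mse = sum((sum(s) / n - mu) ** 2
          for s in product([0, 1], repeat=n)) / 2 ** n

assert abs(mse - sigma2 / n) < 1e-12
```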
square RV with $m$ degrees of freedom is $(1-2t)^{-m/2}$ for $t < 1/2$. The MGF of the sum of $k$ i.i.d. Chi-square RVs is the $k$-fold product of their MGFs, which in this case is
$$(1-2t)^{-\left(\sum_{i=1}^{k} n_i\right)/2} = (1-2t)^{-nk/2}$$
for $t < 1/2$. Hence $S$ is Chi-square with $nk$ degrees of freedom, where we recalled t
7.8. From the given information, the admissible strategies are on the curve shown below:
[Figure: admissible risk curve in the $(R(d, \theta_1), R(d, \theta_2))$ plane, with the line $0.5\,R(d, \theta_1) + 0.5\,R(d, \theta_2) = c^*$ tangent to it.]
The value of $c^*$ is the point at which they touch. For the curve we have
$$\frac{dR(d, \theta_2)}{dR(d, \theta_1)} = \cdots$$
while
7.3 The test is $\delta(n) = 1$ if $\frac{1}{n}\sum_{i=1}^{n} X_i \ge c$, for rejecting the hypothesis. Here $c$ is obtained from
$$0.05 = \int_c^{\infty} \frac{1}{\sqrt{2\pi\sigma^2/n}} \exp\left(-\frac{(x-1)^2}{2\sigma^2/n}\right) dx.$$
When $\sigma = 1$, and converting to the standard Normal, we get
$$0.95 = F_{SN}\big(\sqrt{n}\,(c - 1)\big) = F_{SN}(1.645).$$
Hence $c = 1 + 1.645/\sqrt{n} \to 1$ as $n \to \infty$.
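A quick check of the threshold computation, using the error function to evaluate the standard Normal CDF (a standard identity, not anything specific to this problem):

```python
import math

def std_normal_cdf(x):
    # F_SN(x) via the error function: Phi(x) = (1 + erf(x / sqrt(2))) / 2.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# F_SN(1.645) ~ 0.95, so the alpha = 0.05 threshold is c = 1 + 1.645/sqrt(n),
# which tends to 1 as n grows.
assert abs(std_normal_cdf(1.645) - 0.95) < 1e-3

for n in (10, 100, 10000):
    c = 1.0 + 1.645 / math.sqrt(n)
    assert c > 1.0
assert abs((1.0 + 1.645 / math.sqrt(10 ** 8)) - 1.0) < 1e-3
```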
From Eq. (5.29),
$$f_{\mathbf{Y}}(\mathbf{y}) = |J|\, f_{\mathbf{X}}\big(B^{-1}\mathbf{y}\big)$$
where
$$|J| = \left|\det\begin{bmatrix} \cdot & \cdots & \cdot \\ \vdots & \ddots & \vdots \\ \cdot & \cdots & \cdot \end{bmatrix}\right| = \left|\det B^{-1}\right|.$$
But, if $B = A^{-1}$ then $\det B = 1/\det A$ and so
$$f_{\mathbf{Y}}(\mathbf{y}) = |\det A|\, f_{\mathbf{X}}\big(A\mathbf{y}\big).$$
38. This problem is a special case of problem 5.37 with
7.14. a) The LRT (for acceptance of the hypothesis) is
$$\frac{(2\pi\sigma^2)^{-1/2} \exp\left[-0.5(X-1)^2/\sigma^2\right]}{(2\pi\sigma^2)^{-1/2} \exp\left[-0.5X^2/\sigma^2\right]} = \exp\left[(X - 1/2)/\sigma^2\right] \ge k.$$
b) Take natural logs of both sides to obtain $X \ge \sigma^2 \ln k + 1/2 \triangleq c$.
c) $$0.02 = \int_c^{\infty} (2\pi\sigma^2)^{-1/2} \exp\left[-0.5(x-1)^2/\sigma^2\right] dx$$
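The algebra in part (a) — the ratio of the two Gaussian densities collapsing to $\exp[(X - 1/2)/\sigma^2]$ — can be verified numerically at a few points (the value $\sigma^2 = 2$ is arbitrary):

```python
import math

def npdf(x, mean, sigma2):
    # Normal pdf with the given mean and variance.
    return (math.exp(-0.5 * (x - mean) ** 2 / sigma2)
            / math.sqrt(2 * math.pi * sigma2))

sigma2 = 2.0  # illustrative variance
for x in (-1.0, 0.0, 0.5, 2.0):
    ratio = npdf(x, 1.0, sigma2) / npdf(x, 0.0, sigma2)
    assert abs(ratio - math.exp((x - 0.5) / sigma2)) < 1e-12
```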
$$A(n, m) = \frac{\left[\frac{n-1}{m-1}\, F_{n-1,m-1}\right]^{n/2}}{\left[1 + \frac{n-1}{m-1}\, F_{n-1,m-1}\right]^{(m+n)/2}}, \qquad
A(m, n) = \frac{\left[\frac{m-1}{n-1}\, F_{m-1,n-1}\right]^{m/2}}{\left[1 + \frac{m-1}{n-1}\, F_{m-1,n-1}\right]^{(m+n)/2}}.$$
7.25 Under $H_2$ we have $p_i = p_{1i}$ for all $i$; hence
$$E[V \mid H_2] = E\left[\sum_{i=1}^{l} (np_{0i})^{-1}\left(n_i^2 - 2n\,n_i\,p_{0i} + n^2 p_{0i}^2\right)\right]$$
(d) Set
$$\mathbf{Y} \triangleq \begin{bmatrix} X[1] \\ X[2] \end{bmatrix};$$
then $\mathbf{Y}$ is distributed as $N(\boldsymbol{\mu}, K)$ where
$$\boldsymbol{\mu} = \begin{bmatrix} E[X[1]] \\ E[X[2]] \end{bmatrix}$$
and
$$K = \begin{bmatrix} C[1,1] & C[1,2] \\ C[2,1] & C[2,2] \end{bmatrix}.$$
The pdf of vector $\mathbf{Y}$ is given as
$$f_{\mathbf{Y}}(\mathbf{y}) = \frac{1}{2\pi\,(\det K)^{1/2}} \exp\left(-\frac{1}{2}(\mathbf{y} - \boldsymbol{\mu})^T K^{-1} (\mathbf{y} - \boldsymbol{\mu})\right).$$
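A sanity check of the bivariate pdf formula (mean and covariance values below are illustrative): for a diagonal $K$ the joint pdf must factor into two univariate Normal pdfs.

```python
import math

def bivariate_pdf(y, mu, K):
    # Bivariate normal pdf: exp(-0.5 (y-mu)^T K^{-1} (y-mu)) / (2 pi sqrt(det K)).
    (a, b), (c, d) = K
    det = a * d - b * c
    Kinv = [[d / det, -b / det], [-c / det, a / det]]
    u = [y[0] - mu[0], y[1] - mu[1]]
    quad = (u[0] * (Kinv[0][0] * u[0] + Kinv[0][1] * u[1])
            + u[1] * (Kinv[1][0] * u[0] + Kinv[1][1] * u[1]))
    return math.exp(-0.5 * quad) / (2 * math.pi * math.sqrt(det))

def npdf(x, mean, var):
    return math.exp(-0.5 * (x - mean) ** 2 / var) / math.sqrt(2 * math.pi * var)

mu = [1.0, -2.0]                     # illustrative mean vector
K = [[2.0, 0.0], [0.0, 0.5]]         # illustrative diagonal covariance
y = [1.5, -1.0]
assert abs(bivariate_pdf(y, mu, K)
           - npdf(y[0], 1.0, 2.0) * npdf(y[1], -2.0, 0.5)) < 1e-12
```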
17. We need the joint pdf $f(2, 1;\, 10, 5)$. Now
$$\mu[10] = \sum^{10} \cdots$$
7.32. We compute for the ordered conjoined sequence $d = 14$. From Example 76.58, the critical value is $d_0 = 6.3$. Since $d_0 < d$ we accept the hypothesis that $P_1 = P_2$. Yet it is obvious that $P_1$ generates even numbers while $P_2$ generates odd numbers. The run test is n
Here $T$ is given as
$$T = \begin{bmatrix} \cdot & \cdots & \cdot \\ \vdots & \ddots & \vdots \\ \cdot & \cdots & \cdot \end{bmatrix}$$
with entries from $\{0,\ .25,\ .65,\ 1\}$.
11. To prove the Corollary to Theorem 8.11, note that the sequence of events is decreasing here, i.e. $A_1 \supseteq A_2 \supseteq A_3 \supseteq \cdots$, so equivalently the sequence of complementary sets is increasing
We then complete the square as
$$(x_3 - x_2)^2 + (x_2 - x_1)^2 = 2\left[x_2 - \frac{x_1 + x_3}{2}\right]^2 + \frac{(x_3 - x_1)^2}{2}$$
so
$$\frac{1}{2\pi\sigma^2}\int_{-\infty}^{\infty} \exp\left(-\frac{(x_3 - x_2)^2 + (x_2 - x_1)^2}{2\sigma^2}\right) dx_2
= \frac{1}{2\pi\sigma^2}\,\exp\left(-\frac{(x_3 - x_1)^2}{4\sigma^2}\right)\int_{-\infty}^{\infty}\exp\left(-\frac{\left[x_2 - (x_1 + x_3)/2\right]^2}{\sigma^2}\right) dx_2$$
$$= \frac{1}{\sqrt{4\pi\sigma^2}}\,\exp\left(-\frac{(x_1 - x_3)^2}{4\sigma^2}\right).$$
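The result above — convolving two $N(\cdot, \sigma^2)$ kernels yields an $N(\cdot, 2\sigma^2)$ kernel — can be confirmed by direct numerical integration (the values of $\sigma^2$, $x_1$, $x_3$ below are arbitrary):

```python
import math

def npdf(x, mean, var):
    # Normal pdf with the given mean and variance.
    return math.exp(-0.5 * (x - mean) ** 2 / var) / math.sqrt(2 * math.pi * var)

sigma2, x1, x3 = 1.5, 0.3, 2.0

# Trapezoidal integration over x2 of N(x2; x1, sigma2) * N(x3; x2, sigma2);
# the tails beyond +/-20 are negligible for these parameters.
lo, hi, steps = -20.0, 20.0, 40000
h = (hi - lo) / steps
total = 0.0
for i in range(steps + 1):
    x2 = lo + i * h
    w = 0.5 if i in (0, steps) else 1.0
    total += w * npdf(x2, x1, sigma2) * npdf(x3, x2, sigma2)
total *= h

# Should match the N(x1, 2*sigma2) density at x3.
assert abs(total - npdf(x3, x1, 2 * sigma2)) < 1e-7
```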
22. The covariance function is just a function of the difference of the two times. Also the