7. Let X and Y be random variables with joint Lebesgue density
f(x, y) = { …,   x > 0, y ∈ (0, 1);
            0,   otherwise.
a) Find the marginal density for Y.
b) Find the conditional density for X given Y = y.
c) Find P(X > 1 | Y = y), E[X | Y = y], and E[X^2 | Y = y].
Math 6740 (S 2015), Homework 3:
Problem 3.1. (refer to Corollary 9, p. 25 in the current lecture manuscript "Unbiased
Estimation of Risk"). Assume f is a fixed function in L^2(0, 1) and θ^(n) is the vector
of its first n Fourier coefficients, that is θ^(n) = (⟨f, φ
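As a numerical companion, the coefficient vector θ^(n) can be approximated on a grid. The sine basis φ_j(t) = √2 sin(jπt) below is an assumed choice for illustration; the lecture manuscript may use a different orthonormal basis of L^2(0, 1).

```python
import numpy as np

def fourier_coeffs(f, n, grid=10_000):
    """First n Fourier coefficients <f, phi_j> of f in L2(0,1), with
    phi_j(t) = sqrt(2) sin(j*pi*t) (assumed basis)."""
    # Midpoint rule on (0,1): average the integrand over cell midpoints.
    t = (np.arange(grid) + 0.5) / grid
    ft = f(t)
    return np.array([np.mean(ft * np.sqrt(2.0) * np.sin(j * np.pi * t))
                     for j in range(1, n + 1)])

# Sanity check: f = phi_1 should have coefficient vector (1, 0, 0, ...).
theta_n = fourier_coeffs(lambda t: np.sqrt(2.0) * np.sin(np.pi * t), 4)
```

By orthonormality, only the first coefficient is nonzero for this test function, which makes the quadrature easy to verify.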
2. Verify the relation (15.1).
The estimator can be written as

    d(x) = (∫ … dλ) / (∫ … dλ) = x + (∫ … dλ) / (∫ … dλ).

As x → ∞,

    (∫ … dλ) / (∫ … dλ) → 1

by dominated convergence.
Math 6740 (S 2015), Homework 4:
Problem 4.1. Consider observations X = (X_j)_{j=1}^n in the white noise model

    X_j = θ_j + n^{-1/2} ξ_j,   j = 1, …, n,   (1)

where the ξ_j are i.i.d. standard normal, and θ ∈ R^n is unknown. Consider Bayesian
estimation of θ with a prior distribution
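The prior is not specified above; as a sketch, assume the conjugate choice θ_j ~ N(0, τ²) i.i.d. (an assumption for illustration only). The posterior mean in model (1) then shrinks each observation linearly by the factor τ²/(τ² + 1/n):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 400
sigma2 = 1.0 / n   # per-coordinate noise variance in model (1)
tau2 = 1.0 / n     # assumed prior variance, comparable to the noise level

theta = rng.normal(0.0, np.sqrt(tau2), size=n)        # draw theta from the prior
X = theta + rng.normal(0.0, np.sqrt(sigma2), size=n)  # X_j = theta_j + n^{-1/2} xi_j

# Conjugate normal posterior: E[theta_j | X_j] = tau2 / (tau2 + sigma2) * X_j
w = tau2 / (tau2 + sigma2)
theta_bayes = w * X

mse_raw = np.mean((X - theta) ** 2)          # risk of the naive estimate X
mse_bayes = np.mean((theta_bayes - theta) ** 2)
```

With τ² = 1/n the shrinkage weight is 1/2 and the Bayes estimate roughly halves the mean squared error of the raw observations.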
Math 6740 (S 2015), Homework 5:
Problem 5.1. (refer to Special case 2, p. 37 in the current handout). In the situation of Theorem 13, show that the following simplified linear estimator d is also
asymptotically minimax: set

    … = n^{1/(2β+1)} M^{1/(2β+1)} (
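As a generic numerical illustration of why a cutoff of order n^{1/(2β+1)} is natural for linear estimators in the Gaussian sequence model: a simple projection estimator with this cutoff already beats keeping every coordinate. The smoothness β, the decaying test sequence, and the hard projection below are illustrative assumptions, not the construction from Theorem 13.

```python
import numpy as np

rng = np.random.default_rng(1)

beta = 1.0                           # assumed smoothness parameter
n = 10_000
eps = n ** -0.5                      # noise level in the sequence model
j = np.arange(1, n + 1, dtype=float)
theta = j ** -(beta + 0.55)          # a sequence with summable smoothness-beta tail

X = theta + eps * rng.normal(size=n)

N = int(n ** (1.0 / (2.0 * beta + 1.0)))   # cutoff ~ n^{1/(2*beta+1)}
d = np.where(j <= N, X, 0.0)               # projection (linear) estimator

risk_proj = np.sum((d - theta) ** 2)
risk_naive = np.sum((X - theta) ** 2)      # keep-everything estimator, risk ~ 1
```

The projection pays N·ε² variance plus a bias tail of the same order, balancing the two at the classical n^{-2β/(2β+1)} rate.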
Solutions
Chapter 16: Asymptotic Optimality
1. Consider a regression model in which Y_i = x_i θ + ε_i, i = 1, 2, …, with
the ε_i i.i.d. from N(0, σ²), and assume Σ_{i=1}^∞ x_i² < ∞. Let Q_n denote
the joint distribution of Y_1, …, Y_n if
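A small simulation shows why the condition Σ_{i=1}^∞ x_i² < ∞ matters here: the log likelihood ratio between two parameter values converges along n, so neither sequence of joint distributions can separate from the other. The choices x_i = 1/i, θ = 1 versus θ = 0, and σ = 1 below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

n = 2000
x = 1.0 / np.arange(1, n + 1)   # sum x_i^2 < infinity
Y = rng.normal(size=n)          # data generated under theta = 0, sigma = 1

# Log likelihood ratio of theta = 1 against theta = 0 after n observations:
# log prod_i [phi(Y_i - x_i) / phi(Y_i)] = sum_i (x_i Y_i - x_i^2 / 2).
llr = np.cumsum(x * Y - 0.5 * x ** 2)
```

The partial sums stabilize because the random part Σ x_i ε_i is an L²-bounded martingale and the deterministic part −½ Σ x_i² is absolutely convergent.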
c) Give empirical Bayes estimators for λ_i combining the simple empirical estimates for α and β in (b) with the Bayes estimate for λ_i when
α and β are known in (a).

a) The Bayes estimators are

    λ̂_i = E[λ_i | X_i] = (α + X_i) / (β + m),   i = 1, …, p.

b) By smoothing,

    E X_i = E E[X_i | λ_i]
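The formula in (a) has the shape of a conjugate Gamma–Poisson posterior mean. Assuming X_i | λ_i ~ Poisson(mλ_i) with λ_i ~ Gamma(α, β) in the rate parametrization (a setup inferred from the displayed formula, not stated explicitly here), the Bayes estimate is a convex combination of the MLE X_i/m and the prior mean α/β:

```python
from fractions import Fraction

def bayes_estimate(x, alpha, beta, m):
    # Posterior mean (alpha + x) / (beta + m) under the assumed
    # Poisson(m*lam) likelihood and Gamma(alpha, beta) prior.
    return Fraction(alpha + x, beta + m)

alpha, beta, m = 2, 3, 5
x = 7
post_mean = bayes_estimate(x, alpha, beta, m)

# Weighted-average form: weight w = m / (beta + m) on the MLE x/m,
# weight 1 - w on the prior mean alpha/beta.
w = Fraction(m, beta + m)
combo = w * Fraction(x, m) + (1 - w) * Fraction(alpha, beta)
```

Exact rational arithmetic makes the identity between the two forms transparent; here both equal 9/8.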
Unbiased Estimation of Risk in Nonparametric Estimation
Consider the estimation problem for a parameter θ ∈ R^p from independent observations

    X_i ~ N(θ_i, 1),   i = 1, …, p.

Set X = (X_1, …, X_p). If θ̂(X) is an estimator of θ and h(X) = X − θ̂(X), then under
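The identity underlying unbiased risk estimation in this Gaussian model is Stein's lemma: E[(X − θ) g(X)] = E[g′(X)] for X ~ N(θ, 1) and smooth g of moderate growth. A minimal Monte Carlo check, with g = sin chosen purely as an illustrative smooth bounded function:

```python
import numpy as np

rng = np.random.default_rng(3)

theta = 0.7
X = theta + rng.normal(size=500_000)   # X ~ N(theta, 1)

# Stein's lemma with g(x) = sin(x), so g'(x) = cos(x):
lhs = np.mean((X - theta) * np.sin(X))   # E[(X - theta) g(X)]
rhs = np.mean(np.cos(X))                 # E[g'(X)]
```

Both averages estimate the same quantity, which is what lets the risk of θ̂(X) = X − h(X) be rewritten in terms of observable derivatives of h.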