Tutorial 1
1.
Prove that the ridge regression estimator

\[
\hat\beta_R = \arg\min_{\beta} \Big\{ \sum_{i=1}^{n} \big(Y_i - X_i^{\top}\beta\big)^2 + \lambda \sum_{k=1}^{p} \beta_k^2 \Big\}
\]

has the solution

\[
\hat\beta_R = \Big( \sum_{i=1}^{n} X_i X_i^{\top} + \lambda I \Big)^{-1} \sum_{i=1}^{n} X_i Y_i,
\qquad \text{or equivalently} \qquad
\hat\beta_R = \big( X^{\top} X + \lambda I \big)^{-1} X^{\top} Y.
\]

What about

\[
\hat\beta_R = \arg\min_{\beta} \Big\{ \sum_{i=1}^{n} \big(Y_i - X_i^{\top}\beta\big)^2 + \sum_{k=1}^{p} \lambda_k \beta_k^2 \Big\},
\qquad \text{where } \lambda_k > 0,\ k = 1, \dots, p\,?
\]
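The closed form can be checked numerically. The sketch below uses simulated data (the sample size, design, true coefficients, and penalty values are all arbitrary assumptions, not from the tutorial); for the coordinate-wise penalty it uses the natural generalization in which \(\lambda I\) is replaced by \(\Lambda = \mathrm{diag}(\lambda_1, \dots, \lambda_p)\).

```python
import numpy as np

# Simulated data purely for illustration; n, p, the true coefficients,
# and the penalty values are arbitrary assumptions.
rng = np.random.default_rng(0)
n, p = 50, 4
X = rng.normal(size=(n, p))          # row i is X_i^T
Y = X @ np.array([1.0, -2.0, 0.5, 0.0]) + rng.normal(size=n)

# Uniform penalty: beta_hat = (X'X + lam I)^{-1} X'Y
lam = 2.0
beta_ridge = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ Y)

# Coordinate-wise penalties lam_k: lam I becomes Lambda = diag(lam_1, ..., lam_p)
lams = np.array([0.5, 1.0, 2.0, 4.0])
beta_gen = np.linalg.solve(X.T @ X + np.diag(lams), X.T @ Y)

def ridge_loss(beta, penalty_diag):
    """Sum of squared residuals plus sum_k penalty_k * beta_k^2."""
    return np.sum((Y - X @ beta) ** 2) + np.sum(penalty_diag * beta ** 2)

# Both objectives are strictly convex, so the closed-form solutions
# should beat every nearby perturbation.
for _ in range(100):
    eps = 1e-3 * rng.normal(size=p)
    assert ridge_loss(beta_ridge + eps, np.full(p, lam)) > ridge_loss(beta_ridge, np.full(p, lam))
    assert ridge_loss(beta_gen + eps, lams) > ridge_loss(beta_gen, lams)
print("closed-form solutions minimize their objectives")
```

Equivalently, one can verify that the gradient \(2(X^{\top}X + \Lambda)\beta - 2X^{\top}Y\) vanishes at each closed-form solution.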
2.
In Example 1.1 of lecture notes chapter 1 (part 1), after removing \(x_5\), fit a new linear regression model. Check whether there are other variables that can be removed (using both the t statistic and the CV method).
The estimated model is

\[
\hat y = 0.1891 - 0.3424\,x_1 + 0.9882\,x_2 - 0.2191\,x_3 - 0.7473\,x_4
\]

with standard errors (0.139), (0.149), (0.130), (0.143), (0.114) for the intercept and the four slope coefficients, respectively.
The t statistic of \(x_3\) is 1.53 (\(= 0.2191/0.143\)), the smallest in absolute value and below the usual threshold of about 2, so \(x_3\) can be removed.
The GCV values of the models with (x1, x2, x3, x4), (x1, x2, x3), (x1, x2, x4), (x1, x3, x4), and (x2, x3, x4) are respectively 0.2509600, 0.9080486, 0.2720439, 1.1407614, and 0.3181061. The full model (x1, x2, x3, x4) attains the smallest GCV, so no variable needs to be removed from the model further.
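The GCV comparison can be reproduced in outline. This is a minimal sketch: the data below are simulated stand-ins (the Example 1.1 data set is not reproduced here), and it assumes the standard OLS form \(\mathrm{GCV} = (\mathrm{RSS}/n)\,/\,(1 - p/n)^2\), where \(p\) counts the fitted columns including the intercept; the printed values will therefore not match the tutorial's.

```python
import numpy as np
from itertools import combinations

def gcv(X, y):
    """Generalized cross-validation score for an OLS fit:
    (RSS/n) / (1 - p/n)^2, with p = number of columns of X."""
    n = len(y)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    p = X.shape[1]  # equals tr(H) for a full-rank OLS hat matrix
    return (rss / n) / (1 - p / n) ** 2

# Simulated stand-in data; coefficients loosely mimic the fitted model above.
rng = np.random.default_rng(1)
n = 40
Z = rng.normal(size=(n, 4))  # columns x1..x4
y = (0.19 - 0.34 * Z[:, 0] + 0.99 * Z[:, 1]
     - 0.22 * Z[:, 2] - 0.75 * Z[:, 3] + rng.normal(size=n))

# Score the full model and every 3-variable submodel, as in the tutorial.
names = ["x1", "x2", "x3", "x4"]
for idx in [tuple(range(4))] + list(combinations(range(4), 3)):
    Xd = np.column_stack([np.ones(n)] + [Z[:, j] for j in idx])
    print([names[j] for j in idx], round(gcv(Xd, y), 4))
```

The subset with the smallest GCV is preferred; in the tutorial's numbers the full model wins, which is why no further variable is dropped.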
This note was uploaded on 10/04/2010 for the course STAT ST4240 taught by Professor Xia Yingcun during the Fall '09 term at National University of Singapore.