# HW_3_ML_LG_20090224_2 - CS 6375 Machine Learning Spring...


CS 6375 Machine Learning, Spring 2009, HW3
Gang LIU (SID: 11458407), Feb. 24, 2009
Email: [email protected]

1. KNN

The original data: K = 1;
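The homework's try1.m script is not reproduced in this preview, so as a stand-in, here is a minimal k-NN classifier sketch in pure Python; the function name `knn_predict` and the toy points are my own invention, not from the assignment.

```python
# Minimal k-nearest-neighbor classifier: majority vote among the k
# training points closest (in Euclidean distance) to the query point.
from collections import Counter

def knn_predict(train, labels, x, k):
    """Classify x by majority vote among its k nearest training points."""
    # Indices of training points, sorted by squared distance to x.
    order = sorted(range(len(train)),
                   key=lambda i: sum((a - b) ** 2 for a, b in zip(train[i], x)))
    votes = Counter(labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

# Two well-separated toy clusters (illustrative data only).
train = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0),
         (3.0, 3.0), (4.0, 3.0), (3.0, 4.0)]
labels = ["A", "A", "A", "B", "B", "B"]

print(knn_predict(train, labels, (0.5, 0.5), 1))  # near the first cluster: A
print(knn_predict(train, labels, (3.5, 3.5), 3))  # near the second cluster: B
```

Varying k here reproduces the effect discussed below: k = 1 traces a jagged boundary around individual points, while larger k smooths it out.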

K = 3 (plots produced by try1.m)
If k is too small, say k = 1, the decision boundary fits the existing data too exactly: the classifier can memorize the training data perfectly but generalizes poorly to outside (unseen) data, which is where overfitting occurs. On the other hand, a large k is less sensitive to noise and has better generalization ability.

2. KNN

a) For the majority vote over k neighbors (k odd) to be correct, at most (k-1)/2 of them may be wrong. Assume each neighbor is wrong with probability p (p = 1/2) and correct with probability q = 1 - p = 1/2, and let j be the number of wrong neighbors. The probability of error is

    P(e) = \sum_{j=(k+1)/2}^{k} \binom{k}{j} p^{j} q^{k-j}
         = \left(\tfrac{1}{2}\right)^{k} \sum_{j=(k+1)/2}^{k} \binom{k}{j}
         = \left(\tfrac{1}{2}\right)^{k} \cdot 2^{k-1}
         = \tfrac{1}{2}

The last step uses the symmetry \binom{k}{j} = \binom{k}{k-j}: the coefficients with j > (k-1)/2 sum to exactly half of 2^{k}.

b) Nearest neighbor (k = 1): the vote is a single neighbor, so

    P_{NN}(e) = p = \tfrac{1}{2}

k-nearest neighbor: by part (a),

    P_{kNN}(e) = \tfrac{1}{2} = P_{NN}(e)

So when p = 1/2 neither rule has an advantage: both err exactly half the time.

4. HMM
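The binomial vote-error formula above can be checked numerically. A minimal sketch in Python; the helper name `majority_error` is mine, not from the homework.

```python
# Probability that a majority vote over k independent neighbors is wrong,
# when each individual neighbor is wrong with probability p:
#   P(e) = sum_{j=(k+1)/2}^{k} C(k, j) p^j (1-p)^(k-j)   (k odd)
from math import comb

def majority_error(k, p):
    """Probability that more than half of k independent votes are wrong."""
    q = 1 - p
    return sum(comb(k, j) * p**j * q**(k - j)
               for j in range((k + 1) // 2, k + 1))

for k in (1, 3, 5, 7):
    print(k, majority_error(k, 0.5))   # stays exactly 0.5 for every odd k

print(majority_error(5, 0.3))          # with p < 1/2, larger k reduces error
```

The last line illustrates the contrast with the p = 1/2 case: when individual neighbors are right more often than not, taking more of them strictly helps.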

