hw3_Solution(official)

Machine Learning Homework 3 Solutions

Problem 1

Sketch of the decision boundaries obtained with K-NN classifiers.

[Figure: decision-boundary sketches for the N = 1 and N = 3 cases.]
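The qualitative behavior of the sketched boundaries can also be reproduced in code. Below is a minimal Python sketch using scikit-learn's KNeighborsClassifier; the two-Gaussian toy data, grid resolution, and plot layout are assumptions made purely for illustration, not the dataset from the homework.

# Minimal sketch: how the K-NN decision boundary changes with N.
# The 2-D toy data below is invented for illustration only.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
# Two classes drawn from Gaussians with different means.
X = np.vstack([rng.normal([0, 0], 1.0, (20, 2)),
               rng.normal([2, 2], 1.0, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

# Evaluate each classifier on a dense grid to trace its decision boundary.
xx, yy = np.meshgrid(np.linspace(-3, 5, 300), np.linspace(-3, 5, 300))
grid = np.c_[xx.ravel(), yy.ravel()]

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for ax, n in zip(axes, (1, 3)):
    knn = KNeighborsClassifier(n_neighbors=n).fit(X, y)
    zz = knn.predict(grid).reshape(xx.shape)
    ax.contourf(xx, yy, zz, alpha=0.3)          # shaded decision regions
    ax.scatter(X[:, 0], X[:, 1], c=y, edgecolor="k")
    ax.set_title(f"N = {n}")
plt.show()

With N = 1 every training point claims its own cell of a Voronoi-like tessellation, so the boundary is jagged; with N = 3 the majority vote smooths the boundary.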
Problem 2

The key point of this problem is to understand that the training samples are randomly generated, and that the realized training set determines your test error.

(a) The probability of making an error with the KNN rule is

$$P_n(e) = P(P_{c_2}, C_1) + P(P_{c_1}, C_2),$$

where $P(P_{c_1}, C_2)$ denotes the joint event that the test point actually lies in hypersphere $C_2$ but the KNN
method returns the result as in $C_1$. The only way the KNN rule can make a mistake is when the test point falls in one hypersphere while at most $(k-1)/2$ of the $n$ training samples fall in that same hypersphere, so that the majority of the $k$ nearest neighbors come from the other class. Since each training sample falls in either hypersphere with probability $1/2$, and $P(C_2) = 1/2$,

$$P(P_{c_1}, C_2) = P(P_{c_1} \mid C_2)\,P(C_2) = \frac{1}{2}\sum_{j=0}^{(k-1)/2}\binom{n}{j}\left(\frac{1}{2}\right)^{n}.$$

By symmetry $P(P_{c_2}, C_1)$ takes the same value, so

$$P_n(e) = P(P_{c_2}, C_1) + P(P_{c_1}, C_2) = \sum_{j=0}^{(k-1)/2}\binom{n}{j}\left(\frac{1}{2}\right)^{n}.$$

(b) Show that for this case the 1-nearest-neighbor rule has a lower error probability than the k-nearest-neighbor rule for all k > 1. Since the error probability from part (a) is a sum of nonnegative terms whose $j = 0$ term alone equals the 1-NN error probability $\left(\frac{1}{2}\right)^{n}$, the additional terms present for any $k > 1$ can only increase the total; hence the 1-nearest-neighbor rule has the lower error probability.
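The closed form from (a) and the monotonicity in (b) can be checked numerically. The short Python sketch below follows the coin-flip model used in the derivation (each of the $n$ samples lands in $C_1$ or $C_2$ with probability 1/2, and a test point's nearest neighbors are the samples in its own hypersphere); the function names and parameter values are illustrative assumptions, not part of the assignment.

# Numerical check of the error formula in (a) and the claim in (b).
from math import comb

import numpy as np

def knn_error(n: int, k: int) -> float:
    """Closed form: P_n(e) = sum_{j=0}^{(k-1)/2} C(n, j) * (1/2)^n, odd k."""
    return sum(comb(n, j) for j in range((k - 1) // 2 + 1)) / 2 ** n

def knn_error_mc(n: int, k: int, trials: int = 200_000, seed: int = 0) -> float:
    """Monte Carlo estimate: an error occurs exactly when at most (k-1)/2
    of the n samples fall in the test point's own hypersphere."""
    rng = np.random.default_rng(seed)
    in_own_sphere = rng.binomial(n, 0.5, size=trials)
    return float(np.mean(in_own_sphere <= (k - 1) // 2))

n = 10
for k in (1, 3, 5, 7):
    print(f"k={k}: formula={knn_error(n, k):.4f}, simulated={knn_error_mc(n, k):.4f}")

For k = 1 the sum collapses to its j = 0 term, $(1/2)^n$; every larger odd k adds further nonnegative binomial terms, which is exactly the inequality part (b) asks for.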
