Problem 1: (9 points) K-Nearest Neighbor Classification
Consider the following set of training data, consisting of two-dimensional real-valued features and a binary class value, for a k-nearest-neighbors classifier. Positive data are shown as circles, negative as squares.
(1) Sketch the decision boundary for k = 1. Show your work and justify your answer in two to three sentences.
(2) Sketch the decision boundary for k = 5, in the relevant part of the feature space (i.e., near the training data). Again, show your work and justify your answer in a few sentences.
(3) Sketch the basic shape you would expect to see for the error rate on training data, and on test data, as a function of increasing k = 1, . . . , 7. For the training error rate, indicate the values (error rates) of the endpoints (k = 1 and k = 7).
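For part (3), the key endpoints can be checked computationally. The following is a minimal sketch of a k-nearest-neighbors classifier; since the original figure is not reproduced here, the training set below is a hypothetical stand-in (circles = +1, squares = −1), not the actual data from the problem. With k = 1, each training point is its own nearest neighbor (distance zero), so the training error is 0. With k equal to the full training-set size, every query receives the same majority vote, so the training error equals the minority-class fraction.

```python
from collections import Counter
import math

def knn_predict(train, query, k):
    """Classify `query` by majority vote among its k nearest training points."""
    # Sort training points by Euclidean distance to the query, keep the k closest.
    neighbors = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

def training_error(train, k):
    """Fraction of training points misclassified when each serves as its own query."""
    wrong = sum(knn_predict(train, x, k) != y for x, y in train)
    return wrong / len(train)

# Hypothetical training set (placeholder for the figure in the problem):
# three positives (circles) and four negatives (squares).
train = [((1, 1), +1), ((2, 1), +1), ((2, 2), +1),
         ((4, 4), -1), ((5, 4), -1), ((4, 5), -1), ((5, 5), -1)]

for k in (1, 3, 5, 7):
    print(f"k={k}: training error = {training_error(train, k):.3f}")
```

On this toy set the training error is 0 at k = 1 (each point votes for itself) and 3/7 at k = 7 (all seven points vote, so the three minority-class positives are always misclassified), matching the endpoint behavior part (3) asks you to indicate.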