1. Give an example to show that the decision boundary resulting from the nearest-neighbor rule is piecewise linear.

2. (6.5 in the textbook) Consider the set of two-dimensional vectors from two categories:
   x1 = (1, 0)^T, x2 = (0, 1)^T, x3 = (0, -1)^T, x4 = (0, 0)^T, x5 = (0, 2)^T, x6 = (0, -2)^T, x7 = (-2, 0)^T,
   where x1, x2, x3 belong to class 1, and x4, x5, x6, x7 belong to class 2.
   (a) Plot the decision boundary resulting from the nearest-neighbor rule.
   (b) Find the sample means m1 and m2. Plot the decision boundary corresponding to classifying x by assigning it to the category of the nearest sample mean.

3. Consider the Euclidean metric in d dimensions:

4. Show that there are two equivalent approaches to PCA: solving the eigenproblem related to the covariance matrix (as introduced in the textbook) and solving the eigenproblem related to the Gram matrix. Discuss the relationship between the results of the two approaches. Explain the computational benefits this conclusion gives when actually implementing PCA.

5. Discuss the necessity of the centering process (making the mean of the samples zero) for PCA, and give examples to illustrate the failure of PCA without centering the samples.
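A quick numerical sketch of the two rules in problem 2 (this is an aid for checking your plots, not the required solution; the function names `nn_label` and `mean_label` are my own):

```python
import numpy as np

# The seven points from problem 2 and their class labels.
X = np.array([[1, 0], [0, 1], [0, -1],             # class 1: x1, x2, x3
              [0, 0], [0, 2], [0, -2], [-2, 0]])   # class 2: x4, x5, x6, x7
y = np.array([1, 1, 1, 2, 2, 2, 2])

def nn_label(p):
    """Nearest-neighbor rule: label of the closest training sample."""
    return y[np.argmin(np.linalg.norm(X - p, axis=1))]

def mean_label(p):
    """Nearest-sample-mean rule: label of the closest class mean."""
    m1 = X[y == 1].mean(axis=0)   # sample mean of class 1
    m2 = X[y == 2].mean(axis=0)   # sample mean of class 2
    return 1 if np.linalg.norm(p - m1) <= np.linalg.norm(p - m2) else 2

print(nn_label(np.array([0.9, 0.0])))     # closest sample is x1 -> 1
print(mean_label(np.array([0.9, 0.0])))   # closer to m1 = (1/3, 0) -> 1
```

Evaluating either function on a fine grid and coloring by label is an easy way to visualize the two boundaries before drawing them by hand.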
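For problem 4, the claimed equivalence can be checked numerically (a sketch, not the required proof): for centered data X of shape n x d, the nonzero eigenvalues of X^T X (proportional to the covariance matrix) and of the Gram matrix X X^T coincide, and an eigenvector v of X X^T maps to the eigenvector X^T v of X^T X. This is the computational payoff when n is much smaller than d: one can solve the small n x n problem instead of the d x d one.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 100))      # n=5 samples, d=100 features
X = X - X.mean(axis=0)                 # center the samples

C = X.T @ X                            # d x d, proportional to covariance
G = X @ X.T                            # n x n Gram matrix

# The top n eigenvalues of both matrices agree (the rest of C's are zero).
evals_C = np.sort(np.linalg.eigvalsh(C))[::-1][:5]
evals_G = np.sort(np.linalg.eigvalsh(G))[::-1][:5]
print(np.allclose(evals_C, evals_G))

# Map the top Gram eigenvector to a covariance eigenvector: u = X^T v.
w, V = np.linalg.eigh(G)
v = V[:, -1]                           # Gram eigenvector, largest eigenvalue
u = X.T @ v
print(np.allclose(C @ u, w[-1] * u))   # checks C u = lambda u
```

The mapping follows from C u = X^T X X^T v = X^T (G v) = lambda X^T v = lambda u; the returned u should be normalized before use as a principal direction.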
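For problem 5, here is one hypothetical example of the kind the question asks for (my own construction, not from the textbook): data whose variance lies along the y-axis but whose mean sits far out on the x-axis. Without centering, the top eigenvector of X^T X is pulled toward the mean instead of the direction of maximum variance.

```python
import numpy as np

rng = np.random.default_rng(1)
# 200 samples: tiny spread around x = 10, large spread along y.
X = np.column_stack([10 + 0.1 * rng.standard_normal(200),
                     3.0 * rng.standard_normal(200)])

def top_direction(A):
    """Unit eigenvector of A^T A with the largest eigenvalue."""
    w, V = np.linalg.eigh(A.T @ A)
    return V[:, -1]

d_raw = top_direction(X)                    # PCA without centering
d_ctr = top_direction(X - X.mean(axis=0))   # proper PCA

print(np.round(np.abs(d_raw), 2))   # ~[1, 0]: dominated by the mean offset
print(np.round(np.abs(d_ctr), 2))   # ~[0, 1]: the true variance direction
```

Intuitively, the uncentered second-moment matrix is n (m m^T + Cov), so a large mean m can dominate the covariance term and corrupt every principal direction.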
- Spring '10