COS424/SML 302 Classification methods (slides 19–28 of 57)

COS424/SML 302 Classification methods February 20, 2019 19 / 57

Kernelized feature vector

Project a point x into a D-dimensional feature space by computing the similarity between x and each centroid μ_{1:D} via κ(·, ·):

φ(x) = [κ(x, μ_1), ..., κ(x, μ_D)]

We can use this kernelized feature vector for any type of analysis by replacing x with φ(x). Note: for D centroids and n samples, the computation is O(nD).
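The mapping above can be sketched in a few lines of NumPy. This is a minimal illustration, assuming a Gaussian (RBF) kernel with a bandwidth parameter; the function names and the choice of bandwidth are illustrative, not from the slides:

```python
import numpy as np

def gaussian_kernel(x, mu, bandwidth=1.0):
    """RBF similarity: kappa(x, mu) = exp(-||x - mu||^2 / (2 * bandwidth^2))."""
    return np.exp(-np.sum((x - mu) ** 2) / (2.0 * bandwidth ** 2))

def kernelized_features(x, centroids, bandwidth=1.0):
    """Map x to phi(x) = [kappa(x, mu_1), ..., kappa(x, mu_D)]."""
    return np.array([gaussian_kernel(x, mu, bandwidth) for mu in centroids])

# Toy example: D = 2 centroids in a 2-dimensional input space.
centroids = np.array([[0.0, 0.0], [1.0, 1.0]])
phi = kernelized_features(np.array([0.0, 0.0]), centroids)
# phi[0] == 1.0 (x coincides with mu_1); phi[1] = exp(-1) for mu_2
```

The resulting φ(x) can then be fed to any downstream classifier in place of x.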
Kernelized machines and linear classifiers

The kernelized feature vector represents the similarity of each x to each centroid. For well-chosen centroids, this provides a simple way to define a non-linear decision boundary using a linear classifier.

[Figure: a Gaussian kernel κ with four centroids and a linear classifier]

Kernelized K-nearest neighbor classifier

We can kernelize the KNN classifier. Given a data set D = {(x_1, y_1), ..., (x_n, y_n)} and a kernel κ(x, x′):

- For a test point x*, compute κ(x_i, x*) for all x_i, i = 1:n.
- Find the K nearest neighbors by kernel similarity: the samples most similar to x* in the feature space.
- Predict ẑ* as the most frequent class label among the K nearest neighbors.

This method exploits the kernel trick: computing the kernel function is O(n), but similarity is quantified in a (possibly) high-dimensional feature space.
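The steps above can be sketched as follows. This is a minimal NumPy sketch, again assuming a Gaussian kernel; the toy data and function names are illustrative:

```python
import numpy as np
from collections import Counter

def gaussian_kernel(a, b, bandwidth=1.0):
    """RBF similarity between two points."""
    return np.exp(-np.sum((a - b) ** 2) / (2.0 * bandwidth ** 2))

def kernel_knn_predict(X, y, x_star, K=3, kernel=gaussian_kernel):
    """Predict the label of x_star as the majority class among the K
    training points most similar to x_star under the kernel."""
    sims = np.array([kernel(x_i, x_star) for x_i in X])  # O(n) kernel evaluations
    neighbors = np.argsort(sims)[-K:]                    # indices of K largest similarities
    return Counter(y[neighbors]).most_common(1)[0][0]

# Toy data: two well-separated 1-D clusters.
X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]])
y = np.array([0, 0, 0, 1, 1, 1])
print(kernel_knn_predict(X, y, np.array([0.15])))  # -> 0
```

Note that only similarities sims[i] = κ(x_i, x*) are ever computed; the (possibly high-dimensional) feature space is never materialized.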
K nearest neighbors: example with K = 3, Gaussian kernel

[Figure: sampled height (inches) vs. number of siblings, points colored by sex (0/1); accuracy = 0.86]

Kernelized K nearest neighbors: summary

For K classes, n samples, p features:

                     Naive Bayes   KNN
  Model based?       Y             N
  Classifier type?   generative    cluster
  Kernelizeable?     N             Y
  Additive?          Y             non-linear distance
  Parameters?        N             K
  Multiclass?        Y             Y
  Interpretable?     Y             N
  Missing data?      Y             N
  Training?          O(np)         none
  Test?              O(Kp)         O(np)
Support Vector Machines (SVMs)

An SVM classifier finds a linear separator (hyperplane) between training samples that maximizes the margin between the two classes of samples. The SVM is a type of "large margin classifier."

[Figure: a linear classifier vs. a max-margin linear classifier]

SVMs for binary classification

A support vector machine (SVM) is used for binary classification (z ∈ {-1, 1}). We can fit an SVM to training data:

- w is the normal vector to the hyperplane; w_0 is the offset from the origin to the hyperplane.
- The separating hyperplane is characterized by: wᵀx + w_0 = 0.

[Figure: separating hyperplane with margin of width 2/||w||]
SVMs for binary classification

The distance between the two margin hyperplanes wᵀx + w_0 = 1 and wᵀx + w_0 = -1 is 2/||w||. So the margin, for a given set of support vectors, has width 2/||w||.
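The margin-width formula can be checked numerically. A minimal sketch, assuming a hypothetical 2-D hyperplane with w = (3, 4) (so ||w|| = 5) chosen purely for illustration:

```python
import numpy as np

# Hypothetical separating hyperplane w^T x + w_0 = 0 in 2-D.
w = np.array([3.0, 4.0])
w0 = -1.0

def distance_to_plane(x, w, w0, c=0.0):
    """Euclidean distance from point x to the hyperplane w^T x + w_0 = c."""
    return abs(w @ x + w0 - c) / np.linalg.norm(w)

# Width of the margin between w^T x + w_0 = 1 and w^T x + w_0 = -1.
margin_width = 2.0 / np.linalg.norm(w)
print(margin_width)  # -> 0.4, since ||w|| = 5

# A point on the +1 margin (w^T x + w_0 = 1), e.g. x = (2/3, 0),
# sits at distance 1/||w|| from the separating hyperplane:
x_plus = np.array([2.0 / 3.0, 0.0])
print(distance_to_plane(x_plus, w, w0))  # -> 0.2 = 1/||w||
```

Adding the distances from each margin hyperplane to the separator (1/||w|| on each side) recovers the total width 2/||w||.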
