
An SVM can exploit the kernel trick to define a margin in feature space. Given n training samples, the prediction for a new point $x^*$ is

$$\hat{z}(x^*) = \operatorname{sign}\!\left(\hat{w}_0 + \sum_{i=1}^{n} \alpha_i z_i \,\kappa(x_i, x^*)\right).$$

What does a kernelized linear classifier's separating hyperplane look like in the original feature space?
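A minimal sketch of what this prediction rule computes, assuming labels $z_i \in \{-1, +1\}$ and a Gaussian kernel (the function and variable names are illustrative, not from the slides; in a trained SVM only the support vectors have nonzero $\alpha_i$, so the sum effectively runs over a small subset of the training points):

```python
import numpy as np

def gaussian_kernel(x, y, gamma=1.0):
    """Gaussian (RBF) kernel: kappa(x, y) = exp(-gamma * ||x - y||^2)."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def kernel_svm_predict(x_star, X, z, alpha, w0, kernel=gaussian_kernel):
    """Kernelized SVM prediction:
    z_hat(x*) = sign(w0 + sum_i alpha_i * z_i * kappa(x_i, x*))."""
    s = w0 + sum(a * zi * kernel(xi, x_star)
                 for a, zi, xi in zip(alpha, z, X))
    return np.sign(s)
```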

Example: predicting gender from height, siblings

[Figure: scatter of number of siblings (0 to 6) against height in inches (60 to 85), with points colored by sex (0/1).]
Example: SVM, linear kernel

[Figure: linear-kernel SVM predictions over sampled height (inches) and siblings, with the true sex labels (0/1) overlaid.]

accuracy = 0.88
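The slides don't include the fitting code; below is a minimal scikit-learn sketch of a linear-kernel fit on synthetic stand-in data (the real height/siblings data is not in the slides, so the data generation here is invented for illustration):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for the height/siblings data.
rng = np.random.default_rng(0)
n = 200
sex = rng.integers(0, 2, n)                    # 0/1 labels
height = rng.normal(65 + 5 * sex, 3.0, n)      # class 1 taller on average
siblings = rng.poisson(1.5, n).astype(float)   # weakly informative feature
X = np.column_stack([height, siblings])

X_train, X_test, y_train, y_test = train_test_split(X, sex, random_state=0)
clf = SVC(kernel="linear", C=1.0)
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```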
Example: SVM, Gaussian kernel

[Figure: Gaussian-kernel SVM predictions over sampled height (inches) and siblings, with the true sex labels (0/1) overlaid.]

accuracy = 0.89
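Continuing the sketch above, switching to a Gaussian kernel is a one-line change (the gamma value is illustrative):

```python
# Continues the previous sketch; reuses X_train, y_train, X_test, y_test.
clf_rbf = SVC(kernel="rbf", gamma=0.1, C=1.0)  # gamma sets the kernel bandwidth
clf_rbf.fit(X_train, y_train)
print("test accuracy:", clf_rbf.score(X_test, y_test))
```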
SVMs: summary

For K classes, n training samples, p features:

                    Naive Bayes    KNN               SVMs
Model based?        Y              N                 N
Classifier type?    generative     cluster           discriminative
Kernelizable?       N              Y                 Y
Additive?           Y              non-linear dist   non-linear kern
Parameters?         N              K                 c
Multiclass?         Y              Y                 N
Interpretable?      Y              N                 linear kernel
Missing data?       Y              N                 N
Training?           O(np)          None              O(n^2)
Test?               O(Kp)          O(np)             O(|SVs| p)

Decision trees

A decision tree partitions the feature space in such a way that each partition has as little class uncertainty as possible.

Information gain: the difference in Shannon entropy between a subspace and a partition of that subspace.

Shannon entropy: a measure of unpredictability.

[Figure: two example distributions, labeled "High entropy" and "Low entropy".]

Shannon entropy and a biased coin: What is the bias of the most unpredictable coin? What is the bias of the least unpredictable coin? (See the sketch below.)
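For a coin with heads probability p, the Shannon entropy is $H(p) = -p \log_2 p - (1-p)\log_2(1-p)$. A minimal sketch that answers the two questions numerically (the code is illustrative, not from the slides):

```python
import numpy as np

def coin_entropy(p):
    """Shannon entropy (in bits) of a coin with P(heads) = p."""
    if p in (0.0, 1.0):
        return 0.0  # a deterministic coin is perfectly predictable
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

for p in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"p = {p:.1f}  H = {coin_entropy(p):.3f} bits")
# Entropy peaks at p = 0.5 (the most unpredictable coin) and is
# 0 at p = 0 or p = 1 (the least unpredictable coins).
```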
Decision trees: build from a training data set

To build a decision tree, given a data set $D = \{(x_i, z_i)\}_{i=1}^{n}$, repeat until all samples at a node are in the same class:

- For each feature j, find the information gain from splitting on j.
- Create a decision node that splits on the feature j* with the greatest information gain.
- Recur on the sublists obtained by splitting on j*, and add those nodes as children of node j*.

Once a tree is built, leaf nodes can be pruned. Why is this a good idea?
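A minimal ID3-style sketch of this procedure for categorical features (function and variable names are illustrative, not from the slides):

```python
import numpy as np
from collections import Counter

def entropy(z):
    """Shannon entropy of an integer label array."""
    counts = np.bincount(z)
    probs = counts[counts > 0] / len(z)
    return -np.sum(probs * np.log2(probs))

def information_gain(z, feature_values):
    """Node entropy minus the weighted entropy of the partition
    induced by splitting on each distinct feature value."""
    gain = entropy(z)
    for v in np.unique(feature_values):
        subset = z[feature_values == v]
        gain -= len(subset) / len(z) * entropy(subset)
    return gain

def build_tree(X, z, features=None):
    """Returns a class label (leaf) or (j_star, {value: subtree})."""
    if features is None:
        features = list(range(X.shape[1]))
    if len(set(z)) == 1:          # all samples in the same class: stop
        return int(z[0])
    if not features:              # no features left: take the majority class
        return Counter(z.tolist()).most_common(1)[0][0]
    gains = [information_gain(z, X[:, j]) for j in features]
    j_star = features[int(np.argmax(gains))]
    remaining = [j for j in features if j != j_star]
    children = {v: build_tree(X[X[:, j_star] == v],
                              z[X[:, j_star] == v], remaining)
                for v in np.unique(X[:, j_star])}
    return (j_star, children)
```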

Decision tree: illustration

Starting with our data set D: [Figure: the data in the (X1, X2) plane.]

The best feature to split on in terms of information gain is X1. [Figure: the same data with the candidate splits X1 >= a and X1 < a marked.]
Decision tree: illustration

We can build the first node to partition the data in the tree: [Figure: the root node splits on X1 >= a vs. X1 < a; the data in the X1 >= a branch is further split on X2 >= b vs. X2 < b.]

Decision tree: illustration

In the X1 < a partition, we have no uncertainty in the class; we can stop there.