14-svm_kernels - Foundations of Artificial Intelligence: Support Vector Machines and Kernels

Foundations of Artificial Intelligence
Support Vector Machines and Kernels
CS472 – Fall 2007
Thorsten Joachims

Outline
• Transforming a linear learner into a non-linear learner
• Kernels can make high-dimensional spaces tractable
• Kernels can make non-vectorial data tractable

Non-Linear Problems
Problem:
• Some tasks have non-linear structure.
• No hyperplane is sufficiently accurate.
How can SVMs learn non-linear classification rules?
⇒ By extending the hypothesis space.

Extending the Hypothesis Space
Idea: add more features ⇒ learn a linear rule in the feature space.
Example: the separating hyperplane in feature space is a degree-two polynomial in input space.

Example
Input space: 2 attributes
Feature space: 6 attributes

Dual (Batch) Perceptron Algorithm
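The slides name the dual (batch) perceptron but the preview omits its pseudocode. A minimal sketch of the standard kernelized form — the function names and the XOR toy data are my own illustration, not from the slides. In the dual view the learner keeps one coefficient α_i per training example instead of an explicit weight vector, and touches the data only through the kernel:

```python
import numpy as np

def kernel_perceptron(X, y, kernel, epochs=50):
    """Dual (batch) perceptron: learn one coefficient alpha_i per training
    example. The classifier is sign(sum_i alpha_i * y_i * K(x_i, x))."""
    n = len(X)
    alpha = np.zeros(n)
    # Gram matrix: all pairwise kernel values between training examples.
    K = np.array([[kernel(xi, xj) for xj in X] for xi in X])
    for _ in range(epochs):
        for i in range(n):
            # On a mistake, increment this example's dual coefficient
            # (the dual analogue of w <- w + y_i * x_i).
            if y[i] * np.sum(alpha * y * K[:, i]) <= 0:
                alpha[i] += 1
    return alpha

# Degree-2 polynomial kernel: makes the linear algorithm non-linear.
def poly2(a, b):
    return (np.dot(a, b) + 1) ** 2

# XOR-like data: not linearly separable in the original input space.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1., 1., 1., -1.])
alpha = kernel_perceptron(X, y, poly2)

def predict(x):
    return np.sign(sum(a * yi * poly2(xi, x) for a, yi, xi in zip(alpha, y, X)))
```

Because the algorithm accesses the data only through K(·,·), swapping in a polynomial kernel turns the same linear perceptron into a non-linear classifier — the point of the outline above.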
Dual SVM Optimization Problem
• Primal optimization problem
• Dual optimization problem
Theorem: If w* is the solution of the primal and α* is the solution of the dual, then w* = Σ_i α_i* y_i x_i.

Kernels
Problem: very many parameters! Polynomials of degree …
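The "very many parameters" problem is that explicit polynomial features blow up combinatorially, while a kernel computes the same inner product in input space. A numeric sketch, assuming the standard degree-2 feature map for 2 attributes (the function names are mine): Φ(x) = (1, √2·x1, √2·x2, x1², x2², √2·x1·x2), six features whose inner product equals the kernel (x·z + 1)².

```python
import numpy as np

def phi(x):
    """Explicit degree-2 feature map: 2 attributes -> 6 features."""
    x1, x2 = x
    s = np.sqrt(2.0)
    return np.array([1.0, s * x1, s * x2, x1 ** 2, x2 ** 2, s * x1 * x2])

def poly2_kernel(x, z):
    """Degree-2 polynomial kernel, computed entirely in input space."""
    return (np.dot(x, z) + 1) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])

# The kernel equals the inner product in the 6-dimensional feature
# space without ever constructing the features explicitly.
assert np.isclose(np.dot(phi(x), phi(z)), poly2_kernel(x, z))
```

For degree d over N attributes the explicit map has on the order of N^d features, but the kernel evaluation stays O(N) — which is what makes high-dimensional feature spaces tractable.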

This note was uploaded on 02/19/2008 for the course CS 4700 taught by Professor Joachims during the Fall '07 term at Cornell University (Engineering School).
