Stat841f09 - Wiki Course Notes


…rong results were shown in the early 1990s.

Definition: Support Vector Machines (SVM) (http://en.wikipedia.org/wiki/Support_vector_machine) are a set of related supervised learning methods used for classification and regression. A support vector machine constructs a maximum-margin hyperplane, or a set of such hyperplanes, in a high- or infinite-dimensional space. The points nearest the class boundary, called support vectors, define the model, which can then be used for classification, regression, or other tasks.

Figure 28.2 (Optimal Separating Hyperplane): An example with two classes separated by a hyperplane. The blue line is the least squares solution, which misclassifies one of the training points. Also shown are the black separating hyperplanes found by the perceptron (http://en.wikipedia.org/wiki/Perceptron) learning algorithm with different random starts.

In the figure we can see that the data points fall into two classes that can be separated by a linear boundary. If a dataset is linearly separable, then there exist infinitely many separating hyperplanes for the training data; the black lines in the figure are two of them. The question is: which solution is best when new data are introduced?

Aside: The blue line is the least squares solution, obtained by regressing the −1/+1 response on the inputs (with intercept); the boundary is the line {x : β̂₀ + β̂ᵀx = 0}. This least squares solution does not do a perfect job of separating the points, and makes one error. It is the same boundary found by linear discriminant analysis, in light of its equivalence with linear regression in the two-class case.

Classifiers such as (28.4), which compute a linear combination of the input features and return the sign, were called perceptrons in the engineering literature in the late 1950s.

Identifications:
Hyperplane: separates the two classes.
Margin: the distance between the hyperplane and the closest point,
where the signed distance from a point x to the hyperplane {x : β₀ + βᵀx = 0} is (β₀ + βᵀx) / ‖β‖.
Data points: (xᵢ, yᵢ), with class labels yᵢ ∈ {−1, +1}.
Note: since this distance is positive when a data point lies on one side of the hyperplane and negative on the other, we can classify points as sign(β₀ + βᵀx). If...
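The identifications above can be sketched in code. The snippet below (a minimal illustration with invented toy data, not the course's own implementation) classifies points by the sign of their signed distance to a hyperplane, computes the margin as the distance to the closest point, and runs the perceptron learning algorithm from different random starts; as the figure suggests, different starts can land on different separating hyperplanes.

```python
import numpy as np

def signed_distance(w, X):
    # w = (beta0, beta); the hyperplane is {x : beta0 + beta @ x = 0}.
    # The signed distance is positive on one side and negative on the other.
    return (w[0] + X @ w[1:]) / np.linalg.norm(w[1:])

def perceptron(X, y, rng, max_epochs=1000):
    # Rosenblatt's perceptron: cycle through the points, nudging the
    # weights toward each misclassified example until every sign agrees
    # with its label (guaranteed to terminate for separable data).
    Xa = np.hstack([np.ones((len(X), 1)), X])   # prepend intercept column
    w = rng.normal(size=Xa.shape[1])            # random start
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(Xa, y):
            if yi * (w @ xi) <= 0:              # wrong side of (or on) the boundary
                w = w + yi * xi                 # perceptron update
                mistakes += 1
        if mistakes == 0:                       # linearly separated: done
            break
    return w

# Two linearly separable classes in R^2 (toy data for illustration)
X = np.array([[1.0, 1.0], [1.5, 2.0], [2.0, 1.5],
              [4.0, 4.0], [4.5, 5.0], [5.0, 4.5]])
y = np.array([-1, -1, -1, 1, 1, 1])

for seed in (0, 1):                             # different random starts
    w = perceptron(X, y, np.random.default_rng(seed))
    d = signed_distance(w, X)
    assert (np.sign(d) == y).all()              # classify by the sign of the distance
    margin = np.abs(d).min()                    # distance to the closest point
```

Note that the perceptron stops at the first hyperplane that separates the data; the optimal separating hyperplane of the figure's title is instead the one that maximizes the margin, which is the quantity computed on the last line.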

