margin actually achieved by the perceptron algorithm. This is not the maximum margin
but may nevertheless be indicative of how hard the problem is. Given X and θ, γ_geom
can be calculated in MATLAB as follows:
gamma_geom = min(abs(X*theta / norm(theta)))
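The same computation can be sketched in NumPy; the data here are illustrative stand-ins, not the course datasets:

```python
import numpy as np

# NumPy analogue of the MATLAB one-liner above (X and theta are
# made-up toy values, used only to show the computation).
X = np.array([[2.0, 1.0], [-1.0, -3.0], [4.0, 0.5]])
theta = np.array([1.0, 1.0])

# After the perceptron has converged, every point is correctly
# classified, so |theta^T x| equals the signed margin y * theta^T x.
gamma_geom = np.min(np.abs(X @ theta)) / np.linalg.norm(theta)
print(gamma_geom)
```

Taking `abs` is valid only once the algorithm has converged; before convergence one would use the signed quantity y·θᵀx instead.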
We get γ_geom^a = 1.6405 and γ_geom^b = 0.0493, again with some variation due to the order
in which one selects the training examples. These margins appear to be consistent with
our analysis, at least in terms of their relative magnitude. The bound on the number of
updates holds for any margin, maximum or not, but gives the tightest guarantee with
the maximum margin.

(d) Given X, R can be calculated in MATLAB as
R = max(sqrt(sum(X.^2,2)))
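A NumPy sketch of the same quantity, again with an illustrative toy matrix rather than the course data:

```python
import numpy as np

# NumPy analogue of the MATLAB line: R is the largest Euclidean norm
# among the training points (rows of X). Toy values for illustration.
X = np.array([[3.0, 4.0], [1.0, 1.0], [0.0, 2.0]])
R = np.max(np.sqrt(np.sum(X**2, axis=1)))
print(R)  # row norms are 5, sqrt(2), 2, so R = 5.0
```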
Then, R_a = 200.561 and R_b = 196.3826. Using these, and the maximum margins
γ*_geom^a = 5.5731 and γ*_geom^b = 0.3267 evaluated below, the theoretical bounds on
the number of perceptron updates for the two problems are

k_a ≤ (R_a / γ*_geom^a)^2 ≈ 1295     (1)
k_b ≤ (R_b / γ*_geom^b)^2 ≈ 361333   (2)

Cite as: Tommi Jaakkola, course materials for 6.867 Machine Learning, Fall 2006. MIT OpenCourseWare
(http://ocw.mit.edu/), Massachusetts Institute of Technology. Downloaded on [DD Month YYYY].

[Figure 1 appears here, with two panels: (a) Dataset A and (b) Dataset B.]

Figure 1: The decision boundary θᵀx for the two datasets is shown in black. The two classes are
indicated by the colors red and blue, respectively.
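As a quick sanity check (a sketch, not part of the original solution), the bounds in (1) and (2) follow from plugging the reported R and γ* values into (R/γ*)². The toy perceptron below, on made-up separable data, also illustrates that the actual update count stays well under its own bound:

```python
import numpy as np

# Reproduce bounds (1) and (2) from the reported values; these numbers
# come from the text and are exact only to the printed precision.
R_a, R_b = 200.561, 196.3826
gamma_a, gamma_b = 5.5731, 0.3267
k_a_bound = (R_a / gamma_a) ** 2
k_b_bound = (R_b / gamma_b) ** 2
print(round(k_a_bound), round(k_b_bound))

# A minimal perceptron on an illustrative separable toy set (not the
# course data), counting updates to compare against (R / gamma)^2.
X = np.array([[1.0, 2.0], [2.0, 1.0], [-1.0, -2.0], [-2.0, -1.5]])
y = np.array([1, 1, -1, -1])
theta = np.zeros(2)
updates = 0
changed = True
while changed:
    changed = False
    for xi, yi in zip(X, y):
        if yi * (xi @ theta) <= 0:  # mistake: apply perceptron update
            theta += yi * xi
            updates += 1
            changed = True

# The bound holds for any valid margin, maximum or not, so the achieved
# margin gives a (looser but still valid) guarantee.
R = np.max(np.linalg.norm(X, axis=1))
gamma = np.min(y * (X @ theta)) / np.linalg.norm(theta)
print(updates, updates <= (R / gamma) ** 2)
```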
The bounds are not very tight. This is in part because they must hold for any order
in which one chooses to go through the training examples, and in part because they are
simple theoretical bounds.