Homework Four, due Tue 2/21
CSE 250B
Hand in your homework in hard copy.
The projects for CSE 250B are due on March 21. For this homework, you'll turn in a detailed project
proposal. Here's what it should include:
1. Title
2. If you are developing and test
Homework Zero, due Tue 1/17
CSE 250B
1. Acquire access to Matlab (or Octave, www.octave.org) and start getting comfortable with it. Before embarking upon the rest of this homework set, you will need to download the OCR data from the course webpage. There
Homework Three, due Tue 2/14
CSE 250B
Hand in your homework in hard copy.
Present your results clearly and succinctly using tables and graphs as appropriate. Discuss your results
in precise and lucid prose. Content is king, but looks matter too!
Please
Homework Two, due Tue 2/7
CSE 250B
1. Perceptron over the line. As we discussed in class, the perceptron will not converge if the data are not
linearly separable. However, it is still possible to bound the number of mistakes the perceptron makes
in some c
Lecture 14: Kernels
Perceptron in action
[Figure: training points 1-10 in the plane, each labeled + or -]
w = 0
while some (x, y) is misclassified:
    w = w + y x
Example: w = x(1) - x(7)
Sparse representation: [(1, +1), (7, -1)]
When does this fail?
When data is not linearly separable.
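The sparse representation above can be sketched in code. The course otherwise uses Matlab/Octave; this Python version and its toy data are illustrative assumptions. The point is that w is never stored explicitly: it is a list of (index, label) mistakes, so prediction needs only dot products between data points, which is exactly the step a kernel can replace.

```python
import numpy as np

def sparse_perceptron(X, y, epochs=10):
    """Perceptron storing w implicitly as the list of mistakes
    [(index, label), ...], so w = sum of yi * x(i) over mistakes."""
    mistakes = []
    for _ in range(epochs):
        converged = True
        for t in range(len(X)):
            # w . x_t, computed from the sparse representation alone
            score = sum(yi * np.dot(X[i], X[t]) for i, yi in mistakes)
            if y[t] * score <= 0:
                mistakes.append((t, y[t]))
                converged = False
        if converged:
            break
    return mistakes

# Toy separable data in the plane (an assumption for illustration).
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
rep = sparse_perceptron(X, y)
w = sum(yi * X[i] for i, yi in rep)  # recover the explicit weight vector
```

Replacing `np.dot(X[i], X[t])` by a kernel k(x(i), x(t)) gives the kernel perceptron with no other changes.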
CSE 250B Machine learning
The mind-reading game
[written by Y. Freund and R. Schapire]
Repeat 200 times:
Computer guesses whether you'll type 0 or 1
You type 0 or 1
The computer is right much more than half the time
Strategy: computer predicts next keystroke
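One way to realize this strategy, sketched in Python as an assumption (the slide does not specify the predictor): guess the majority outcome previously observed after the same recent context of keystrokes. Humans are poor random generators, so such a predictor beats one half on typical human input.

```python
from collections import defaultdict

def mind_reader(keystrokes, context_len=2):
    """Predict each bit as the majority outcome seen so far after the
    same length-2 context; return the fraction of correct guesses."""
    counts = defaultdict(lambda: [0, 0])  # context -> [#zeros, #ones]
    correct = 0
    history = []
    for bit in keystrokes:
        ctx = tuple(history[-context_len:])
        n0, n1 = counts[ctx]
        guess = 1 if n1 > n0 else 0
        if guess == bit:
            correct += 1
        counts[ctx][bit] += 1
        history.append(bit)
    return correct / len(keystrokes)

# A human-like repetitive pattern is predicted far above chance.
pattern = [0, 0, 1] * 100
accuracy = mind_reader(pattern)
```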
Lecture 4: Decision trees
Example: iris data, 3 classes (setosa, virginica, versicolor)
[Scatter plot: petal length (PL) vs. petal width (PW)]
Decision trees
- Simple
- Multiclass
- Not perfect, but close
- Comprehensible to humans
How to build them?
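The standard greedy answer can be sketched in Python (the course uses Matlab/Octave; this toy builder and its iris-like data are illustrative assumptions): at each node, pick the feature/threshold split that most reduces Gini impurity, and recurse until the node is pure or a depth limit is hit.

```python
import numpy as np

def gini(y):
    """Gini impurity of a label array: 1 - sum of squared class fractions."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / len(y)
    return 1.0 - np.sum(p ** 2)

def build_tree(X, y, depth=0, max_depth=3):
    """Greedy top-down construction: choose the (feature, threshold)
    split minimizing weighted Gini impurity, then recurse."""
    if len(set(y)) == 1 or depth == max_depth:
        vals, counts = np.unique(y, return_counts=True)
        return vals[np.argmax(counts)]  # leaf: majority class
    best = None
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            left = X[:, f] <= thr
            if left.all() or not left.any():
                continue
            score = (left.sum() * gini(y[left]) +
                     (~left).sum() * gini(y[~left])) / len(y)
            if best is None or score < best[0]:
                best = (score, f, thr)
    _, f, thr = best
    left = X[:, f] <= thr
    return (f, thr,
            build_tree(X[left], y[left], depth + 1, max_depth),
            build_tree(X[~left], y[~left], depth + 1, max_depth))

def predict(tree, x):
    """Walk internal nodes (feature, threshold, left, right) to a leaf."""
    while isinstance(tree, tuple):
        f, thr, lo, hi = tree
        tree = lo if x[f] <= thr else hi
    return tree

# Toy (PL, PW)-style data, 3 classes; values loosely iris-like.
X = np.array([[1.4, 0.2], [1.3, 0.2], [4.7, 1.4],
              [4.5, 1.5], [6.0, 2.5], [5.9, 2.1]])
y = np.array([0, 0, 1, 1, 2, 2])
tree = build_tree(X, y)
```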
K-d trees
Rapidly partition the
Lecture 12:
Generative vs. discriminative
Iris data revisited
Iris data: 3 classes (setosa, virginica, versicolor), 4 features
[Scatter plot: projection onto two of the four features]
Generative model
Model each class by a Gaussian
Immediately makes it possible to fl
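The slide's recipe, modeling each class by a Gaussian, can be sketched as follows. This is an illustrative Python assumption (one full covariance per class plus class priors, i.e. quadratic discriminant analysis; the small ridge added to each covariance is a numerical safeguard, not part of the slide):

```python
import numpy as np

def fit_gaussian_classes(X, y):
    """Fit one Gaussian (prior, mean, covariance) per class."""
    model = {}
    for c in np.unique(y):
        Xc = X[y == c]
        model[c] = (len(Xc) / len(X),                    # prior pi_c
                    Xc.mean(axis=0),                     # mean mu_c
                    np.cov(Xc, rowvar=False)             # covariance S_c
                    + 1e-6 * np.eye(X.shape[1]))         # tiny ridge
    return model

def log_posterior(model, x, c):
    """log pi_c + log N(x; mu_c, S_c), up to a class-independent constant."""
    pi, mu, S = model[c]
    d = x - mu
    _, logdet = np.linalg.slogdet(S)
    return np.log(pi) - 0.5 * logdet - 0.5 * d @ np.linalg.solve(S, d)

def classify(model, x):
    """Predict the class with the highest log posterior."""
    return max(model, key=lambda c: log_posterior(model, x, c))

# Two well-separated toy classes in the plane (assumed for illustration).
X = np.array([[0.0, 0.0], [0.1, 0.2], [0.2, 0.1], [0.1, 0.1],
              [5.0, 5.0], [5.1, 5.2], [5.2, 5.1], [5.1, 5.1]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])
model = fit_gaussian_classes(X, y)
```

Having a fitted density per class is what "immediately makes it possible" to do more than classify, e.g. score how typical a new point is under each class.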
Lecture 3: Perceptron
Recap: Perceptron algorithm
Data points (x1, y1), (x2, y2), ..., with xt ∈ R^d and yt ∈ {+1, -1}, are separable
by a hyperplane through the origin
w = 0
for t = 1, 2, ...:
    if yt (w · xt) ≤ 0:
        w = w + yt xt
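The recap above translates directly into code. A minimal Python sketch (the course uses Matlab/Octave; the toy data here is an assumption), cycling over the points and updating only on a mistake, i.e. when yt (w · xt) ≤ 0:

```python
import numpy as np

def perceptron(X, y, max_passes=100):
    """Perceptron for data separable by a hyperplane through the origin."""
    w = np.zeros(X.shape[1])
    for _ in range(max_passes):
        mistakes = 0
        for xt, yt in zip(X, y):
            if yt * np.dot(w, xt) <= 0:   # mistake: wrong side or on boundary
                w = w + yt * xt
                mistakes += 1
        if mistakes == 0:                  # a full clean pass: converged
            break
    return w

# Toy linearly separable data (assumed for illustration).
X = np.array([[1.0, 2.0], [2.0, 2.0], [-1.0, -1.0], [-2.0, -3.0]])
y = np.array([1, 1, -1, -1])
w = perceptron(X, y)
```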
Claim: Suppose
(i) |xt| ≤ R for all t
(ii) There is some