ECE/CS 532
Note on Gram-Schmidt Orthogonalization
Suppose we have a set of vectors v1, v2, . . . , vn ∈ R^m, with m ≥ n. The vectors may be close to
each other or even collinear. For example, two of the vectors, say vi and vj, may point in almost the same
direction.
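As a concrete illustration (our addition, not part of the original note), classical Gram-Schmidt can be sketched in Matlab as follows; the function name gram_schmidt and the loop structure are our own choices, and the sketch assumes the columns of V are linearly independent:

```matlab
% classical Gram-Schmidt: orthonormalize the columns of V
% (a sketch; assumes the columns of V are linearly independent)
function Q = gram_schmidt(V)
[m,n] = size(V);
Q = zeros(m,n);
for k = 1:n
    q = V(:,k);
    for j = 1:k-1
        q = q - (Q(:,j)'*V(:,k))*Q(:,j);  % remove component along q_j
    end
    Q(:,k) = q/norm(q);                   % normalize
end
```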
ECE/CS 532
Note on Kernel Methods
Recall the standard least squares problem
    min_{x ∈ R^n} ||b − Ax||^2 .
We have seen in class and homework that sometimes nonlinear, rather than linear, combinations of features
can produce better predictions. The easiest way to do
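One common construction (our illustration, not the note's own example) lifts each scalar feature into a vector of monomials and then solves the same least squares problem in the lifted features; the data and degree below are arbitrary choices:

```matlab
% least squares with polynomial (degree-3) features -- a sketch
a = linspace(-1,1,50)';            % scalar inputs
b = sin(3*a) + 0.1*randn(50,1);    % noisy nonlinear responses
A = [ones(50,1) a a.^2 a.^3];      % lifted feature matrix
x = A\b;                           % least squares fit: min ||b - A*x||^2
b_hat = A*x;                       % predictions, nonlinear in a but linear in x
```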
% nonlocal means
% rebecca willett, 9/15/2014
function y = nlm(x,h,sig)
h = floor(h/2); % h must be odd
[n,m] = size(x);
x_pad = padarray(x,[h h],'replicate','both');
y = zeros(size(x));
sig = sig*(2*h+1)^2;
fprintf(' i = ')
for i = 1:n
for j = 1:m
patch
% create data for clustering
clear
close all
n=100;
cnt=1;
while cnt < n
a = 2*rand(2,1)-1;
if norm(a) < 1/3
A(:,cnt) = a;
cnt=cnt+1;
elseif (norm(a) > .5 && norm(a) < .6)
A(:,cnt) = a;
cnt=cnt+1;
end
end
scatter(A(1,:),A(2,:),'*')
axis([-1 1 -1
ECE/CS 532
Note on Support Vector Machines
Consider the classification problem depicted in Figure 1 below. Given a set of labeled training examples,
the goal is to learn a linear classifier. Each example in the training dataset consists of two features and
a label.
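A linear classifier on two features predicts the label from the sign of an affine function; the weights w and offset c below are hypothetical, chosen only to illustrate the form (our addition):

```matlab
% a linear classifier on two features: predict sign(w'*a + c)
w = [1; -1]; c = 0;        % hypothetical weight vector and offset
a = [0.3; 0.1];            % one example with two features
label_hat = sign(w'*a + c); % predicted label, +1 or -1
```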
% training data and LS
clear
close all
% generate data
m = 1000;
n = 2;
A = zeros(m,n);
b = zeros(m,1);
figure(1);
subplot(121); hold on;
for i=1:m
a = 2*rand(2,1)-1;
A(i,:)=a';
b(i) = sign(a(1)^2+a(2)^2-.5);
if b(i)==1
plot(a(1),a(2),'b.');
else
plot(a(1),a(2),'r
% SVD example
clear
close all
B = [5 1;5 -1; 5 1];
n=50;
X = B*randn(2,n);
X = X + .0*randn(size(X));
[u,d,v]=svd(X);
scatter3(X(1,:),X(2,:),X(3,:),'.')
hold on
pc1 = d(1,1)/sqrt(n)*[0 0 0;u(1,1) u(2,1) u(3,1)];
t=plot3(pc1(:,1),pc1(:,2),pc1(:,3),'m');
set
clear
close all
n = 64;
% noise free image
x = double(phantom(n))*256;
% add noise
y = x + randn(size(x))*15;
% denoise by non-local means
sig = 150; % weighting factor; bigger sig = more smoothing
h = 5; % patch sidelength
x_nlm = nlm(y,h,sig);
% denoise by d
CS/ECE/ME 532
Homework 8: Hinge Loss and SVMs
1. Reconsider the basketball player classification problem discussed in class and the lecture notes. Recall
we are trying to predict whether a person is a basketball player based on height. The training data
con
CS/ECE/ME 532
Homework 7: Classification and Kernel Methods
Consider the classification problem discussed in class, where the goal is to design a classifier based on
the training data shown in the plot below. The Matlab file training_data.m generates this dataset
CS/ECE/ME 532
Homework 6: The SVD and Least Squares
1. Recall the face emotion classification problem from HW 3. Design and compare the performances of the
classifiers proposed in a and b, below. In each case, divide the dataset into 8 equal-sized subsets (e.
CS/ECE/ME 532
Homework 5: The SVD
In this homework set you will work with and analyze the dataset jesterdata.mat, which is available
on the moodle site. The dataset contains an m = 100 by n = 7200 dimensional matrix X. Each row of X
corresponds to a joke,
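As an illustration of the kind of computation this assignment involves (our addition; the random stand-in data and the truncation rank k = 5 are arbitrary choices), the SVD gives the best rank-k approximation of a matrix:

```matlab
% best rank-k approximation of a ratings-style matrix via the SVD
X = randn(100,7200);                 % stand-in for the jesterdata matrix
[U,S,V] = svd(X,'econ');
k = 5;                               % truncation rank (arbitrary choice)
Xk = U(:,1:k)*S(1:k,1:k)*V(:,1:k)';  % rank-k approximation of X
err = norm(X - Xk,'fro');            % approximation error
```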
ECE/CS 532
Homework 4: Orthogonality
1. Consider the matrix and vector
        [ 3  1 ]             [ 1 ]
    A = [ 0  3 ]   and   b = [ 3 ] .
        [ 0  4 ]             [ 1 ]
a. By hand, find two orthonormal vectors that span the plane spanned by the columns of A.
b. Make a sketch of these vectors and the columns of A in three dimensions
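For a numerical check of part a (our addition, assuming A = [3 1; 0 3; 0 4] as in the problem), Gram-Schmidt on the columns of A gives q1 = [1; 0; 0] and q2 = [0; 3/5; 4/5]:

```matlab
% check part (a) numerically: orthonormalize the columns of A
A = [3 1; 0 3; 0 4];
q1 = A(:,1)/norm(A(:,1));        % q1 = [1; 0; 0]
w  = A(:,2) - (q1'*A(:,2))*q1;   % remove the q1 component: [0; 3; 4]
q2 = w/norm(w);                  % q2 = [0; 3/5; 4/5]
disp([q1 q2])                    % columns are orthonormal: Q'*Q = eye(2)
```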
ECE/CS 532
Homework 3: Least Squares
1. Consider the following matrix and vector:
        [ 1  1 ]           [ 1 ]
    A = [ 1  1 ] ,     b = [ 1 ] .
        [ 1  1 ]           [ 0 ]
a. Find the solution x to min_x ||b − Ax||^2 .
b. Make a sketch of the geometry of this particular problem in R^3, showing the columns of A, the
plane
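As a general sanity check (our addition, not part of the assignment; the matrix below is an arbitrary full-column-rank example, not the one in the problem), a least squares solution can be computed either via the normal equations or Matlab's backslash operator:

```matlab
% two equivalent ways to solve min_x ||b - A*x||^2 (A has full column rank)
A = [1 0; 0 1; 1 1];
b = [1; 1; 0];
x_ne = (A'*A)\(A'*b);   % normal equations
x_bs = A\b;             % Matlab's built-in least squares solve
```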
ECE/CS 532
Homework 2: Vectors and Matrices
1. Let X = [x1 x2 . . . xn] ∈ R^(p×n), where xi ∈ R^p is the ith column of X. Consider the matrix

    C = (1/n) X X^T .
a. Express C as a sum of rank-1 matrices (i.e., columns of X times rows of X T ).
b. Assuming x1 , x2 , . . . , x
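The rank-1 decomposition asked for in part a, C = (1/n) Σ_i xi xi^T, can be checked numerically (our addition; the dimensions are arbitrary):

```matlab
% verify that (1/n) * sum_i x_i*x_i' equals (1/n) * X * X'
p = 4; n = 10;
X = randn(p,n);
C = X*X'/n;
C_sum = zeros(p);
for i = 1:n
    C_sum = C_sum + X(:,i)*X(:,i)'/n;  % rank-1 term from column i
end
norm(C - C_sum)   % essentially zero
```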
CS229 Lecture notes
Andrew Ng
Part V
Support Vector Machines
This set of notes presents the Support Vector Machine (SVM) learning algorithm. SVMs are among the best (and many believe are indeed the best)
off-the-shelf supervised learning algorithms. To tell
NOTES ON THE SVM AND HINGE LOSS
ANIRUDDHA BHARGAVA
1. The Formulation of the Classification Problem
We are interested in solving the problem of classifying points in R^n as either having a label +1 or
−1. More formally, we are given data points ai ∈ R^n and th
CS/ECE/ME 532
Practice Midterm Exam 1
Name:
1. ____    2. ____    3. ____    (total score) ____
1. Answer the following questions. Make sure to explain your reasoning.
a. What is the rank of the following matrix:
        [ 1  2  3 ]
    A = [ 2  4  6 ]
        [ 3  6  9 ]
b. Are the columns of the following matrix linea
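For part a, each row of A is a multiple of [1 2 3], so the rank is 1; a quick Matlab check (our addition, not part of the exam) confirms this:

```matlab
% rank check for part (a): rows 2 and 3 are multiples of row 1
A = [1 2 3; 2 4 6; 3 6 9];
rank(A)   % returns 1
```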