CS534 Written Homework Assignment 2 Due Oct 21 in class, 2014
1. (Decision theory). Consider a case where we have learned a conditional probability distribution P(y|x).
Suppose there are only two classes, and let p0 = P(y = 0|x) and p1 = P(y = 1|x). Co
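The optimal decision depends on the loss assigned to each kind of error. As a minimal sketch (the loss values below are illustrative assumptions, not taken from the assignment), the Bayes-optimal decision picks the class with the smaller expected loss:

```python
# Bayes-optimal decision for a two-class problem, given p0 = P(y=0|x),
# p1 = P(y=1|x), and a loss matrix loss[predicted][true].
def bayes_decision(p0, p1, loss):
    # Expected loss (risk) of predicting class k:
    # sum over true classes y of loss[k][y] * P(y|x).
    risk0 = loss[0][0] * p0 + loss[0][1] * p1
    risk1 = loss[1][0] * p0 + loss[1][1] * p1
    return 0 if risk0 <= risk1 else 1

# 0-1 loss: predict the more probable class.
zero_one = [[0, 1], [1, 0]]
print(bayes_decision(0.6, 0.4, zero_one))   # -> 0

# Asymmetric loss: predicting 0 when the truth is 1 costs 10.
asym = [[0, 10], [1, 0]]
print(bayes_decision(0.6, 0.4, asym))       # -> 1
```

With the asymmetric loss, the less probable class is chosen because the cost of missing it is much larger.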
CS534 Written Homework Assignment 4
Submitted by: Meghamala Sinha
1.a) The decision tree is shown in Fig. 1. I have used binary labelling to classify A, B, C, D, E, F into
their corresponding input regions.
[Fig. 1: partition of the (X1, X2) plane (both axes 0-30) into input regions, with A, B, E in the lower half and C, D, F in the upper half.]
A Course in Machine Learning
by Hal Daumé III
Machine learning is the study of algorithms that learn from
data and experience. It is applied in a vast variety of
application areas, from medicine to advertising, from
military to pedestrian. Any area in whic
Linear classification models:
Perceptron
CS534 Machine Learning
Linear Classifier
[Figure: scatter of positive (+) and negative (−) training points in the (x1, x2) plane, separable by a linear boundary.]
We have discussed Logistic Regression
LR learns: P(y = 1|x) = 1 / (1 + exp(−(w·x + b)))
which yields a linear decision boundary
We will now look at a different paradigm
for learn
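A minimal sketch of how the logistic model turns a score into a probability and a linear decision (the weights and inputs below are illustrative assumptions):

```python
import math

def sigmoid(z):
    # Logistic function: maps a real-valued score to (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def lr_predict(w, b, x):
    # P(y=1|x) for a logistic regression model with weights w, bias b.
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    p = sigmoid(z)
    # Thresholding p at 0.5 is equivalent to thresholding z at 0,
    # so the decision boundary w.x + b = 0 is linear.
    return p, int(p >= 0.5)

p, label = lr_predict([1.0, -2.0], 0.5, [3.0, 1.0])  # z = 1.5
print(round(p, 3), label)   # -> 0.818 1
```

Because the sigmoid is monotone, the predicted class depends only on the sign of w·x + b, which is why the boundary is a hyperplane.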
Neural Networks
CS534
Motivations
Analogy to biological systems, which are the
best examples of robust learning systems
Consider human brain:
Neuron switching time ~ 10^-3 s
Scene recognition can be done in 0.1 s
There is only time for about a hundred
Dimension Reduction
CS534
Why dimension reduction?
High dimensionality = large number of features
E.g., documents represented by thousands of words,
millions of bigrams
Images represented by thousands of pixels
Redundant and irrelevant features (not all
CS534 Homework Assignment 1 Due Oct 9th in class, 2014
Written assignment
1. Consider two coins, one is fair and one is not. The unfair coin has a 1/10 probability for head. Now
you close your eyes and pick a random coin (each coin has a 50% probability b
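The question is truncated here, but as an illustrative sketch of the kind of Bayes-rule computation involved (assuming, hypothetically, that we observe one head and ask for the posterior probability that the picked coin is the fair one):

```python
from fractions import Fraction

# Prior: each coin is picked with probability 1/2.
prior_fair = Fraction(1, 2)
prior_unfair = Fraction(1, 2)
# Likelihood of a head: 1/2 for the fair coin, 1/10 for the unfair coin.
p_head_fair = Fraction(1, 2)
p_head_unfair = Fraction(1, 10)

# Bayes rule: P(fair | head) = P(head|fair) P(fair) / P(head).
p_head = p_head_fair * prior_fair + p_head_unfair * prior_unfair
posterior_fair = p_head_fair * prior_fair / p_head
print(posterior_fair)   # -> 5/6
```

Seeing a head is much more likely under the fair coin, so the posterior shifts strongly toward it.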
CS534 Written Homework Assignment 3 Due Oct 31st 4pm, 2014
(BOOOOO!)
Submit your homework in the homework dropbox outside the instructor's office or via email to [email protected]
Late submission should only be submitted via email.
1. Consider the following tra
CS534 Implementation Assignment 1 Due 11:59PM Oct 12th, 2014
General instructions.
1. The following languages are acceptable: Java, C/C++, Matlab, Python and R.
2. You can work in teams of up to 3 people. Each team will only need to submit one copy of the s
Introduction to Constrained Optimization
Duality and KKT Conditions
Pratik Shah
{pratik.shah [at] lnmiit.ac.in}
The LNM Institute of Information Technology
www.lnmiit.ac.in
February 13, 2013
Geometry of the Problem
CS534 Machine Learning
Lecture 1:
A basic introduction to ML
What is Machine learning
Performance P
Task T
Learning Algorithm
Experience E (Data)
Machine learning studies algorithms that
improve performance P
at some task T
based on experience E
Machine learning in Computer
Decision Tree
Decision Tree for
Playing Tennis
(outlook=sunny, wind=strong, humidity=normal, ? )
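The query instance above can be classified by walking the tree from the root. As a minimal sketch, assuming the standard PlayTennis tree (Outlook at the root; sunny branches on Humidity, rain branches on Wind, overcast is always Yes) — the tree structure here is an assumption, not taken from the slide:

```python
# Hand-coded version of the classic PlayTennis decision tree
# (assumed structure; the slide's tree may differ).
def play_tennis(outlook, humidity, wind):
    if outlook == "sunny":
        return "yes" if humidity == "normal" else "no"
    if outlook == "overcast":
        return "yes"
    if outlook == "rain":
        return "yes" if wind == "weak" else "no"
    raise ValueError("unknown outlook")

# The slide's query: outlook=sunny, wind=strong, humidity=normal.
print(play_tennis("sunny", "normal", "strong"))   # -> yes
```

Note that wind is never consulted on the sunny branch; a decision tree only tests the attributes along the path it takes.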
DT for predicting C-section risks
Characteristics of Decision Trees
Decision trees have many appealing properties
Similar to human decision process, easy to
RF uses an ensemble of a large number of unpruned decision trees and is
known to be very accurate in classification, comparable to Support Vector
Machines (SVM) [3
In addition to bagging, each node of a tree selects only a small random subset of features for
the spl
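The two sources of randomness described above, bootstrap sampling of examples plus per-split feature subsetting, can be sketched with only the standard library (dataset and subset sizes below are illustrative assumptions):

```python
import random

def bootstrap_sample(data, rng):
    # Bagging: sample n examples with replacement from the training set.
    n = len(data)
    return [data[rng.randrange(n)] for _ in range(n)]

def random_feature_subset(n_features, k, rng):
    # At each split, consider only k of the n_features candidates.
    return rng.sample(range(n_features), k)

rng = random.Random(0)
# Toy dataset: (feature_vector, label) pairs.
data = [([0.1 * i, 1.0 - 0.1 * i, i % 2], i % 2) for i in range(10)]

sample = bootstrap_sample(data, rng)            # one tree's training set
features = random_feature_subset(3, 2, rng)     # one node's candidate features
print(len(sample), sorted(features))
```

Each tree in the forest would be grown on its own bootstrap sample, calling `random_feature_subset` afresh at every node; averaging many such decorrelated trees is what gives RF its accuracy.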
1) Let ε_i be the weighted error of h_i.
So we get ε_i = Σ_{j=1}^{N} D_i(j) · I(h_i(X_j) ≠ y_j),
where I(·) is the indicator function: 1 if its argument is true, 0 otherwise.
According to the update rule of Adaboost, assuming that the weights of the correct examples
are multiplied by e,
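The weighted error and the weight update can be sketched as follows (α here is computed with the standard AdaBoost formula α = ½ ln((1−ε)/ε), which is an assumption about the notation this derivation uses):

```python
import math

def weighted_error(D, preds, labels):
    # eps_i = sum_j D(j) * I(h_i(X_j) != y_j)
    return sum(d for d, p, y in zip(D, preds, labels) if p != y)

def adaboost_update(D, preds, labels):
    eps = weighted_error(D, preds, labels)
    alpha = 0.5 * math.log((1 - eps) / eps)
    # Correct examples are multiplied by e^{-alpha},
    # incorrect ones by e^{+alpha}; then renormalize to sum to 1.
    new = [d * math.exp(-alpha if p == y else alpha)
           for d, p, y in zip(D, preds, labels)]
    z = sum(new)
    return [d / z for d in new]

D = [0.25] * 4
preds = [1, 1, 0, 1]
labels = [1, 1, 1, 1]          # one mistake -> eps = 0.25
print(weighted_error(D, preds, labels))   # -> 0.25
D2 = adaboost_update(D, preds, labels)
print(round(sum(D2), 6))                  # -> 1.0
```

After the update, the misclassified example carries half of the total weight, so the next weak learner is forced to focus on it.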
function [Result] = LDACluster()
% Load the labels and training data, then split the examples by class label.
Result = [];
c1 = [];
c2 = [];
B = importdata('walking.train.labels');
train = importdata('walking.train.data');
for i = 1:length(B)
    if B(i) == 0   % '==' tests equality; a single '=' is assignment and invalid here
        c1(i, :) = train(i, :);   % note: rows of the other class remain as zeros
    else
        c2(i, :) = train(i, :);
    end
end
z = length(c1);
k = 1;
for i =
CS534 Implementation Assignment 2 Due Oct 28th 11:59PM, 2014
General instruction.
1. The following languages are acceptable: Java, C/C++, Matlab, Python and R.
2. You can work in teams of up to 3 people. Each team will only need to submit one copy of the so
Support Vector Machines
CS534  Machine Learning
Perceptron Revisited: Linear Separators
Binary classification can be viewed as the task
of separating classes in feature space:
[Figure: positive (+) and negative (−) examples separated by the hyperplane w·x + b = 0, with w·x + b > 0 on one side and w·x + b < 0 on the other.]
f(x) = sign(w x + b)
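The decision rule f(x) = sign(w·x + b) can be written directly (the weights and test points below are illustrative assumptions):

```python
def linear_classify(w, b, x):
    # f(x) = sign(w.x + b): +1 on one side of the hyperplane, -1 on the other.
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score > 0 else -1

w, b = [2.0, -1.0], -1.0
print(linear_classify(w, b, [2.0, 1.0]))   # 2*2 - 1 - 1 = 2  -> +1
print(linear_classify(w, b, [0.0, 2.0]))   # -2 - 1 = -3      -> -1
```

Perceptron, logistic regression, and SVMs all predict with this same rule; they differ only in how w and b are learned.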
Linear Separ
Support Vector Machine (cont.)
CS534  Machine Learning
Summarization So Far
We demonstrated that we prefer linear
classifiers with a large margin.
We formulated the problem of finding the
maximum margin linear classifier as a
quadratic optimizati
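The quadratic optimization problem referred to above is standardly written as follows (a reconstruction of the well-known hard-margin formulation, not copied from the slide):

```latex
\min_{\mathbf{w},\, b} \;\; \frac{1}{2}\,\lVert \mathbf{w} \rVert^2
\quad \text{subject to} \quad
y_i \left( \mathbf{w} \cdot \mathbf{x}_i + b \right) \ge 1,
\qquad i = 1, \dots, N
```

The objective is quadratic in w and the constraints are linear, which is exactly what makes this a quadratic program.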