Machine Learning
Lesson 06 Notes
Quiz: The Best Line
C: Ok, so today we're going to talk about support vector machines and I'm going to do
something unexpected. I'm going to start out the beginning of
Machine Learning
Lesson 07 Notes
Quiz: Computational Learning Theory
M: Hey, Charles.
C: Oh, hi Michael.
M: It's funny running into you here.
C: It is. It's always funny running into you over the
Machine Learning
Lesson 01 Notes
Difference between Classification and Regression
C: Today we are going to talk about supervised learning. But, in particular what
we're going to talk about are two kin
Machine Learning
Lesson 02 Notes
Quiz: Regression
M: So, let me tell you about regression. We are in this section of the class, talking about
supervised learning. In supervised learning we can take ex
Machine Learning
Lesson 03 Notes
Neural Networks
M: I'm excited to tell you about neural networks today. You may be familiar with neural networks
because you have one, in your head.
C: I do?
M: Well,
Kernel Methods and SVMs Extension
The purpose of this document is to review material covered in
Machine Learning 1 Supervised Learning regarding support
vector machines (SVMs). This document also provides
Linear Regression
Udacity
What is a Linear Equation?
Equation of a line: y = mx + b, where m is the slope of the line and (0, b) is the y-intercept.
Notice that the degree of this equation is 1. In hig
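The line equation above can be sketched as a one-line function; the slope and intercept values below are illustrative, not taken from the notes.

```python
# A minimal sketch of the line equation y = m*x + b.
def line(x, m, b):
    """Evaluate y = m*x + b for a given x."""
    return m * x + b

# Slope m = 2, y-intercept (0, 3): the line passes through (0, 3),
# and each unit step in x raises y by the slope m.
assert line(0, 2, 3) == 3   # at x = 0, y equals the intercept b
assert line(1, 2, 3) == 5   # one step in x adds the slope m
```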
Instance Based Learning Extension
In this document, we review information from the Machine
Learning 1 Supervised Learning course regarding instance
based learning, with a focus on the k nearest neighbors algorithm
Neural Networks
The purpose of this document is to review neural networks,
discuss training rules and provide an example illustrating
backpropagation.
Perceptrons and the Perceptron Rule:
Gradient Descent /
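The perceptron rule mentioned above updates each weight by Δw_i = η(t − o)x_i, where t is the target, o the thresholded output, and η the learning rate. A minimal sketch, with illustrative data (logical AND) and an assumed learning rate and epoch count:

```python
# A minimal sketch of the perceptron training rule, delta_w_i = eta*(t - o)*x_i.
# Learning rate, epoch count, and training data are illustrative assumptions.
def predict(w, x):
    # Threshold unit: output 1 if w.x > 0, else 0 (x includes a bias input of 1).
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0

def train(samples, eta=0.1, epochs=20):
    w = [0.0] * len(samples[0][0])
    for _ in range(epochs):
        for x, t in samples:
            o = predict(w, x)
            # Perceptron rule: nudge each weight in proportion to the error (t - o).
            w = [wi + eta * (t - o) * xi for wi, xi in zip(w, x)]
    return w

# Learn logical AND; inputs are augmented with a constant bias input of 1.
data = [([0, 0, 1], 0), ([0, 1, 1], 0), ([1, 0, 1], 0), ([1, 1, 1], 1)]
w = train(data)
assert [predict(w, x) for x, _ in data] == [0, 0, 0, 1]
```

AND is linearly separable, so the perceptron convergence theorem guarantees this loop settles on a separating weight vector.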
CS 478 Machine Learning: Homework 1
Suggested Solutions
1
kNN Decision Boundaries
(a)
(b) (c) It would be classified as a circle. Adding one point is enough, say a cross at exactly (1, -1). (d) It wou
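The idea in (c) — that one added training point is enough to flip a query's label under 1-NN — can be sketched as follows. The training points below are illustrative stand-ins, not the actual homework figure.

```python
# A minimal 1-NN sketch: adding a single training point can flip the label
# of a nearby query. The training data here are illustrative assumptions.
def nearest_label(train, query):
    """Return the label of the training point closest (Euclidean) to query."""
    def dist2(p, q):
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    return min(train, key=lambda pt: dist2(pt[0], query))[1]

train = [((0, 0), "circle"), ((3, 3), "circle")]
query = (1, -1)
assert nearest_label(train, query) == "circle"

# Add one cross exactly at the query location: it becomes the nearest
# neighbor, so the query's 1-NN label flips.
train.append(((1, -1), "cross"))
assert nearest_label(train, query) == "cross"
```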
CS 478 Machine Learning: Homework 2
Suggested Solutions
1
Separable or Not?
(a) See the following tree:
(b) If we draw a large enough sample, there would be at least two points on each of the fou
CS 478 Machine Learning: Homework 3
Suggested Solutions
1
SVM inside-out, the Primal (15 points)
(a) It is easy to check that for the soft margin SVM, w = 0, ξ = 1 is always a feasible solution. Al
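The feasibility claim in (a) (where the symbol dropped by extraction is presumably the slack variable ξ) can be checked numerically: with w = 0 and b = 0, every soft-margin constraint y_i(w·x_i + b) ≥ 1 − ξ_i reduces to 0 ≥ 0 when ξ_i = 1. The sample points below are arbitrary illustrative data.

```python
# A quick numerical check: w = 0, b = 0, slack xi_i = 1 satisfies every
# soft-margin SVM constraint y_i * (w.x_i + b) >= 1 - xi_i, since both
# sides reduce to 0 >= 0. The data points are illustrative assumptions.
def feasible(data, w, b, slack):
    return all(y * (sum(wj * xj for wj, xj in zip(w, x)) + b) >= 1 - s
               for (x, y), s in zip(data, slack))

data = [([1.0, 2.0], 1), ([-1.0, 0.5], -1), ([3.0, -2.0], 1)]
w, b = [0.0, 0.0], 0.0
assert feasible(data, w, b, [1.0] * len(data))       # always feasible
assert not feasible(data, w, b, [0.0] * len(data))   # zero slack fails at w = 0
```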
Problem 1: Viterbi Algorithm
[50 points]
(a) [5 points] By definition, δ_{y,t−1} is the probability of the most probable English sequence corresponding to the first t−1 Greek observations (x_1, x_2, ..., x_{t−1}) t
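The recurrence behind (a) is the standard Viterbi dynamic program: the best-path probability for state y at step t extends the best-path probabilities at step t−1 by one transition and one emission. A minimal sketch with its own notation (v[y] for the running best-path probability) and a toy two-state HMM whose parameters are illustrative assumptions:

```python
# A minimal Viterbi sketch: v[y] holds the probability of the most probable
# state sequence ending in state y after the observations seen so far.
# The toy HMM parameters below are illustrative assumptions.
def viterbi(obs, states, start, trans, emit):
    # Base case: one observation, path of length one.
    v = {y: start[y] * emit[y][obs[0]] for y in states}
    for x in obs[1:]:
        # Recurrence: extend each path by one step, keeping the best predecessor.
        v = {y: max(v[yp] * trans[yp][y] for yp in states) * emit[y][x]
             for y in states}
    return max(v.values())

states = ["A", "B"]
start = {"A": 0.6, "B": 0.4}
trans = {"A": {"A": 0.7, "B": 0.3}, "B": {"A": 0.4, "B": 0.6}}
emit = {"A": {"x": 0.9, "y": 0.1}, "B": {"x": 0.2, "y": 0.8}}

p = viterbi(["x", "y"], states, start, trans, emit)
# Best path is A then B: 0.6*0.9 * 0.3 * 0.8 = 0.1296.
assert abs(p - 0.1296) < 1e-9
```

A full solution would also keep backpointers to recover the argmax sequence; only the max probability is computed here.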