Machine Learning
Lesson 06 Notes
Quiz: The Best Line
C: Ok, so today we're going to talk about support vector machines and I'm going to do
something unexpected. I'm going to start out the beginning of this with a quiz.
M: With a quiz.
C: Yes, with a quiz.
Machine Learning
Lesson 07 Notes
Quiz: Computational Learning Theory
M: Hey, Charles.
C: Oh, hi Michael.
M: It's funny running into you here.
C: It is. It's always funny running into you over the interwebs.
M: So, today, I have the pleasure of telling
Machine Learning
Lesson 01 Notes
Difference between Classification and Regression
C: Today we are going to talk about supervised learning. But, in particular what
we're going to talk about are two kinds of supervised learning, and one particular way to
do
Machine Learning
Lesson 02 Notes
Quiz: Regression
M: So, let me tell you about regression. We are in this section of the class, talking about
supervised learning. In supervised learning we can take examples of inputs and outputs and
based on that we are g
Machine Learning
Lesson 03 Notes
Neural Networks
M: I'm excited to tell you about neural networks today. You may be familiar with neural networks
because you have one, in your head.
C: I do?
M: Well, yeah. I mean, you have a network of neurons. Like, you kno
Kernel Methods and SVMs Extension
The purpose of this document is to review material covered in
Machine Learning 1 Supervised Learning regarding support
vector machines (SVMs). This document also provides a
general overview of some extensions to that which w
Linear Regression
Udacity
What is a Linear Equation?
Equation of a line : y = mx+b, where m is the slope of the line and (0, b) is the y-intercept.
Notice that the degree of this equation is 1. In higher dimensions when we talk about linear
equations we a
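The slope-intercept form above can be evaluated directly; the slope and intercept values below are illustrative, not taken from the notes:

```python
# Evaluate y = m*x + b for an illustrative line with slope 2 and
# y-intercept 3, i.e. the line passes through (0, 3).
def line(x, m=2.0, b=3.0):
    return m * x + b

print(line(0.0))  # 3.0  (the y-intercept)
print(line(2.0))  # 7.0
```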
Instance Based Learning Extension
In this document, we review information from the Machine
Learning 1 Supervised Learning course regarding instance
based learning, with a focus on the k nearest neighbors algorithm.
A brief extension beyond what was discussed in th
Neural Networks
The purpose of this document is to review neural networks,
discuss training rules and provide an example illustrating
backpropagation.
Perceptrons and the Perceptron Rule:
Gradient Descent / Delta Rule:
Neural Networks:
Backpropagation:
In lesson t
CS 478 Machine Learning: Homework 1
Suggested Solutions
1
kNN Decision Boundaries
(a)
(b) (c) It would be classified as a circle. Adding one point is enough, say a cross at exactly (1, -1). (d) It would be classified as a circle. The five closest points ar
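The majority-vote rule behind these answers can be sketched as follows. The training points below are hypothetical stand-ins, not the actual figure from the homework:

```python
from collections import Counter
import math

def knn_classify(train, query, k):
    """Majority vote among the k training points nearest to `query`.
    `train` is a list of ((x, y), label) pairs."""
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical data (not the homework's figure): circles cluster near the
# origin, crosses sit farther out.
train = [((0, 0), "circle"), ((1, 0), "circle"), ((0, 1), "circle"),
         ((3, 3), "cross"), ((4, 4), "cross")]
print(knn_classify(train, (1, -1), k=3))  # circle
```

With k = 3, the three points nearest to (1, -1) are all circles, so the query is labeled "circle" regardless of the two distant crosses.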
CS 478 Machine Learning: Homework 2
Suggested Solutions
1
Separable or Not?
(a) See the following tree:
(b) If we draw a large enough sample, there would be at least two points on each of the four positions. Since there is no noise in the label,
CS 478 Machine Learning: Homework 3
Suggested Solutions
1
SVM inside-out, the Primal (15 points)
(a) It is easy to check that for the soft margin SVM, w = 0, ξ = 1 is always a feasible solution. All the constraints are satisfied. (b) Consider the it
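The feasibility claim in (a) can be verified numerically: with w = 0, b = 0 and every slack ξ_i = 1, the soft-margin constraint y_i (w · x_i + b) ≥ 1 − ξ_i reduces to 0 ≥ 0 for every point. The data below is arbitrary, chosen only to exercise the check:

```python
# Sanity check: w = 0, b = 0, xi_i = 1 satisfies the soft-margin
# constraint y_i * (w . x_i + b) >= 1 - xi_i for any data set,
# since both sides become 0.
data = [((1.0, 2.0), +1), ((-3.0, 0.5), -1), ((0.0, 0.0), +1)]
w, b, xi = (0.0, 0.0), 0.0, 1.0

feasible = all(
    y * (w[0] * x[0] + w[1] * x[1] + b) >= 1 - xi
    for x, y in data
)
print(feasible)  # True
```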
Problem 1: Viterbi Algorithm
[50 points]
(a) [5 points] By definition, δ_{y,t-1} is the probability of the most probable English sequence corresponding to the first t - 1 Greek observations (x1, x2, ..., x_{t-1}) that ends
with the English character y. So, the most prob
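The dynamic program this definition sets up can be sketched in a few lines. The two-state model below (states, transition, and emission tables) is entirely hypothetical, not from the problem set:

```python
# Minimal Viterbi sketch over a toy 2-state model.
def viterbi(obs, states, start_p, trans_p, emit_p):
    # delta[y] = probability of the best state sequence for the
    # observations seen so far that ends in state y.
    delta = {y: start_p[y] * emit_p[y][obs[0]] for y in states}
    back = []
    for x in obs[1:]:
        prev = delta
        delta, ptr = {}, {}
        for y in states:
            best = max(states, key=lambda yp: prev[yp] * trans_p[yp][y])
            delta[y] = prev[best] * trans_p[best][y] * emit_p[y][x]
            ptr[y] = best
        back.append(ptr)
    # Recover the most probable sequence by following back-pointers.
    last = max(states, key=lambda y: delta[y])
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

states = ["a", "b"]
start_p = {"a": 0.6, "b": 0.4}
trans_p = {"a": {"a": 0.7, "b": 0.3}, "b": {"a": 0.4, "b": 0.6}}
emit_p = {"a": {"x": 0.9, "y": 0.1}, "b": {"x": 0.2, "y": 0.8}}
print(viterbi(["x", "y", "y"], states, start_p, trans_p, emit_p))
# → ['a', 'b', 'b']
```

Each step extends δ by one observation using the recurrence the definition implies, so the whole table is filled in O(T · |states|²) time.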