15-859(B) Machine Learning Theory
Homework # 4.5
Due: end of March, 2007
Some time within the next 3 weeks, come talk with me (in person or by email) about your
plans for your class project.
A typical project might be: reading one or two papers on some to
15-859(B) Machine Learning Theory
Homework # 3
Due: February 22, 2012
Groundrules: Same as before. You should work on the exercises by yourself but may work
with others on the problems (just write down who you worked with). Also if you use material
from o
15-859(B) Machine Learning Theory
Homework # 5
Due: April 4, 2012
Groundrules: Same as before. You should work on the exercises by yourself but may work
with others on the problems (just write down who you worked with). Also if you use material
from outsi
15-859(B) Machine Learning Theory
Homework # 6
Due: April 30, 2012
Groundrules: Same as before. You should work on the exercises by yourself but may work
with others on the problems (just write down who you worked with). Also if you use material
from outs
15-859(B) Machine Learning Theory
Homework # 2
Due: February 13, 2012
Groundrules: Same as before. You should work on the exercises by yourself but may work
with a partner on the problems (just write down who you worked with). Also if you use
material fro
15-859(B) Machine Learning Theory
Avrim Blum
Lecture 4: January 30, 2012
Online Learning (contd.)
* The Perceptron Algorithm
* Perceptron for Approximately Maximizing the Margins
* Kernel Functions
Plan for today: Last time we looked at the Winnow algorithm,
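Since the outline above lists the Perceptron algorithm, a minimal sketch of the mistake-driven update may help (illustrative code, not taken from the lecture notes; the toy data set is hypothetical):

```python
# Minimal Perceptron sketch: predict sign(w . x); on a mistake on (x, y),
# update w <- w + y*x. Converges on linearly separable data.
import numpy as np

def perceptron(X, y, max_passes=100):
    """Run the Perceptron algorithm; X has one example per row, y in {-1,+1}."""
    w = np.zeros(X.shape[1])
    for _ in range(max_passes):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * np.dot(w, xi) <= 0:  # mistake (or point on the boundary)
                w += yi * xi
                mistakes += 1
        if mistakes == 0:  # a full pass with no mistakes: done
            break
    return w

# Hypothetical separable data: label is the sign of the first coordinate.
X = np.array([[2.0, 1.0], [1.0, -1.0], [-2.0, 1.0], [-1.0, -2.0]])
y = np.array([1, 1, -1, -1])
w = perceptron(X, y)
assert all(np.sign(X @ w) == y)
```

The mistake bound from lecture (at most (R/γ)² mistakes, where R bounds the example norms and γ is the margin) is what guarantees the loop above terminates on separable data.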
15-859(B) Machine Learning Theory
Homework # 4
Due: March 21, 2012
Groundrules: Same as before. You should work on the exercises by yourself but may work
with others on the problems (just write down who you worked with). Also if you use material
from outs
Basic setting
15-859(B) Machine Learning Theory
Examples are points x in instance space, like R^n.
Assume drawn from some probability distribution:
Lecture 11: More on why large margins
are good for learning. Kernels and
general similarity functions. L1 L2
con
Reinforcement learning
and
Markov Decision Processes (MDPs)
15-859(B)
Avrim Blum
RL and MDPs
General scenario: We are an agent in some state. Have observations, perform actions, get rewards. (See lights, pull levers, get
cookies)
Markov Decision Process:
Today's focus: sample complexity
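The MDP setting above can be made concrete with a minimal value-iteration sketch; the two-state chain, transition probabilities, and rewards below are hypothetical, not from the lecture:

```python
# Tiny value-iteration sketch for a 2-state MDP (all numbers hypothetical).
# Bellman optimality update: V(s) = max_a [ R(s,a) + gamma * sum_s' P(s'|s,a) V(s') ].
states, actions, gamma = [0, 1], [0, 1], 0.9
# P[s][a] = list of (next_state, prob); R[s][a] = immediate reward.
P = {0: {0: [(0, 1.0)], 1: [(1, 1.0)]},
     1: {0: [(0, 1.0)], 1: [(1, 1.0)]}}
R = {0: {0: 0.0, 1: 1.0}, 1: {0: 0.0, 1: 2.0}}

V = {s: 0.0 for s in states}
for _ in range(200):  # iterate the Bellman update until it converges
    V = {s: max(R[s][a] + gamma * sum(p * V[s2] for s2, p in P[s][a])
                for a in actions)
         for s in states}

# Staying in state 1 and taking action 1 forever earns reward 2 per step,
# so V(1) should approach 2 / (1 - gamma) = 20.
assert abs(V[1] - 20.0) < 1e-6
```

Each iteration contracts toward the optimal value function by a factor of gamma, which is why a few hundred updates suffice here.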
15-859(B) Machine Learning Theory
Lecture 5: uniform convergence, tail
inequalities, VC-dimension I
Avrim Blum
02/01/12
Basic sample complexity bound recap
If |S| ≥ (1/ε)[ln(|C|) + ln(1/δ)], then with probability ≥ 1 - δ, all h ∈ C with zero training error have true error less than ε.
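The bound in the recap, m ≥ (1/ε)(ln|C| + ln(1/δ)), is easy to evaluate numerically; here is a small sketch with hypothetical values of |C|, ε, and δ:

```python
# Occam/consistency sample-size sketch: m >= (1/eps)(ln|C| + ln(1/delta))
# labeled examples suffice so that, with probability >= 1 - delta, every
# hypothesis in C consistent with the sample has true error < eps.
import math

def occam_sample_size(size_C, eps, delta):
    return math.ceil((1.0 / eps) * (math.log(size_C) + math.log(1.0 / delta)))

# Hypothetical example: |C| = 2**20 hypotheses, eps = 0.1, delta = 0.05.
m = occam_sample_size(2**20, 0.1, 0.05)  # -> 169
```

Note the logarithmic dependence on |C|: squaring the class size only doubles the ln|C| term, which is what makes the bound useful even for very large finite classes.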
15-859(B) Machine Learning Theory
Lecture 15: Learning from noisy data,
intro to SQ model
Avrim Blum
03/19/12
Hoeffding/Chernoff bounds: minimizing training
error will approximately minimize true error: just
need O(1/ε²) samples versus O(1/ε).
What about
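To contrast the two regimes mentioned above, O(1/ε²) samples in the agnostic/Hoeffding setting versus O(1/ε) in the realizable case, here is a back-of-the-envelope sketch; the constants follow the standard union-bound argument, and the example values are hypothetical:

```python
# Comparing the two sample-complexity regimes for a finite class C.
import math

def agnostic_sample_size(size_C, eps, delta):
    # Hoeffding + union bound over |C| hypotheses:
    # P(|err_S(h) - err(h)| > eps) <= 2 exp(-2 m eps^2), so
    # m >= (1/(2 eps^2)) ln(2|C|/delta) makes all estimates eps-accurate whp.
    return math.ceil((1.0 / (2 * eps**2)) * math.log(2 * size_C / delta))

def realizable_sample_size(size_C, eps, delta):
    # Realizable case: m >= (1/eps)(ln|C| + ln(1/delta)) suffices.
    return math.ceil((1.0 / eps) * (math.log(size_C) + math.log(1.0 / delta)))

# For small eps the 1/eps^2 dependence dominates (values hypothetical):
a = agnostic_sample_size(1000, 0.01, 0.05)
r = realizable_sample_size(1000, 0.01, 0.05)
assert a > r
```

The extra 1/ε factor is the price of estimating every hypothesis's error to within ε, rather than merely ruling out hypotheses with error above ε.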
15-859(B) Machine Learning
Learning finite state
environments
Avrim Blum
04/02/12
Consider the following setting
Say we are a baby trying to figure out
the effects our actions have on our
environment.
Perform actions
Get observations
Try to make an in
Semi-Supervised Learning
15-859(B) Machine Learning Theory
Semi-Supervised Learning
Avrim Blum
02/29/12
The main models we have been studying (PAC,
mistake-bound) are for supervised learning.
Given labeled examples S = {(x_i, y_i)}, try to learn a
good p
15-859(B) Machine Learning Theory
Homework # 1
Due: February 1, 2012
Groundrules:
Homeworks will generally consist of exercises, easier problems designed to give you
practice, and problems that may be harder and/or somewhat open-ended. You should
do th
15-859(B) Machine Learning Theory
Homework # 3
Due: February 20, 2007
Groundrules: Same as before. You should work on the exercises by yourself but may work
with others on the problems (just write down who you worked with). Also if you use material
from o
15-859(B) Machine Learning Theory
Homework # 1
Due: January 30, 2007
Groundrules:
Homeworks will generally consist of exercises, easier problems designed to give you
practice, and problems that may be harder and/or somewhat open-ended. You should
do th
15-859(B) Machine Learning Theory
Homework # 4
Due: March 6, 2007
Groundrules: Same as before. You should work on the exercises by yourself but may work
with others on the problems (just write down who you worked with). Also if you use material
from outsi
15-859(B) Machine Learning Theory
Homework # 2
Due: February 8, 2007
Groundrules: Same as before. You should work on the exercises by yourself but may work
with a partner on the problems (just write down who you worked with). Also if you use
material from
15-859(B) Machine Learning Theory
Homework # 5
Due: April 10, 2007
Groundrules: Same as before. You should work on the exercises by yourself but may work
with others on the problems (just write down who you worked with). Also if you use material
from outs
15-859(B) Machine Learning Theory
Homework # 6
Due: April 24, 2007
Groundrules: Same as before. You should work on the exercises by yourself but may work
with others on the problems (just write down who you worked with). Also if you use material
from outs
Basic Supervised learning setting
* Examples are points x in instance space, like R^n.
* Labeled + or -.
* Assume drawn from some probability distribution:
15-859(B) Machine Learning Theory
Lecture 13: Margins, kernels, and
similarity functions
Distr
15-859(B) Machine Learning
Learning finite state
environments
Avrim Blum
03/29/07
Consider the following setting
Say we are a baby trying to figure out
the effects our actions have on our
environment.
Perform actions
Get observations
Try to make an
Machine learning can be used to
15-859(B) Machine Learning Theory
Lecture 1: intro, models and basic
issues
Avrim Blum
01/16/07
recognize speech, steer cars/robots,
play games,
adapt programs to users,
categorize documents, ...
Goals of machine learnin
15-859(B) Machine Learning Theory
* Active Learning Overview (see also slides)
* Margin Based Learning of Linear Separators
There has recently been substantial interest in using unlabeled data together with labeled data
for machine learning. The motivatio