A Few Useful Things to Know about Machine Learning
Pedro Domingos
Department of Computer Science and Engineering
University of Washington
Seattle, WA 98195-2350, U.S.A.
pedrod@cs.washington.edu
ABSTRACT
Machine learning algorithms can figure out how to perform

CS229 Problem Set #1
CS 229, Autumn 2015
Problem Set #1: Supervised Learning
Due in class (9:30am) on Wednesday, October 14.
Notes: (1) These questions require thought, but do not require long answers. Please be as
concise as possible. (2) If you have a

FSA CLUB LEAD TRAINING
PART II: SPEAKING ABOUT MOZILLA
V1.1 Jan 2015
Hello aspiring Club Lead!
Congratulations on completing the first
module! You are awesome!
Now, this module will be loaded with
information. Prepare to read a lot! But you
might already

FSA CLUB LEAD TRAINING
PART III: HANDLING YOUR FIREFOX CLUB
v1.1 Jan 2015
Hello aspiring Club Lead!
Congratulations on getting to the last part
of the modules! Hope you still have the
energy to finish the CLT!
Now, the last module might contain new
infor

Programming Exercise 1: Linear Regression
Machine Learning
Introduction
In this exercise, you will implement linear regression and get to see it work
on data. Before starting on this programming exercise, we strongly recommend watching the video lectures
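As a warm-up, the core computation of the exercise can be sketched in a few lines. This is only an illustrative sketch, not the exercise's starter code (the course's own materials may use a different language and file structure); the function name and parameters below are mine:

```python
def gradient_descent(X, y, alpha=0.01, iters=1500):
    """Batch gradient descent for linear regression.
    X: list of feature rows (each already including the intercept term 1),
    y: list of targets. Returns the fitted parameter vector theta."""
    m, n = len(X), len(X[0])
    theta = [0.0] * n
    for _ in range(iters):
        # prediction errors h_theta(x) - y for each training example
        errs = [sum(t * xij for t, xij in zip(theta, xi)) - yi
                for xi, yi in zip(X, y)]
        # simultaneous update of every theta_j
        theta = [tj - alpha * sum(e * xi[j] for e, xi in zip(errs, X)) / m
                 for j, tj in enumerate(theta)]
    return theta
```

On exactly linear data (e.g. y = 1 + 2x) the parameters converge to the true intercept and slope; on real data they converge to the least-squares fit.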

ISyE 6761 Stochastic Processes I Fall 2012
(revised 8/21/12)
Class Times and Place: MW 9:30-11:00 am, IC217.
Review Classes: F 9:30-11:00 am, IC217.
Course Website: www.isye.gatech.edu/sman/courses/6761.
Instructor: Dave Goldsman; Groseclose 433; (404)894

Probability & Statistics Review
Dave Goldsman
Updated 8/21/12
Getting Started: The Gambler's Ruin
Each time a gambler plays, he wins $1 with probability p and loses $1 with probability
1 - p = q. Each play is independent. Suppose he starts with $i. Find t
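The question is presumably the classical one: find the probability that the gambler reaches some target fortune $n before going broke. A hedged sketch (the function names are mine, not the notes') comparing the standard closed-form answer with a Monte Carlo check:

```python
import random

def win_prob(i, n, p):
    """Closed-form probability that a gambler starting with $i reaches $n
    before hitting $0, winning each $1 bet independently w.p. p."""
    q = 1.0 - p
    if p == 0.5:
        return i / n            # symmetric case: simple ratio
    r = q / p
    return (1 - r**i) / (1 - r**n)

def simulate(i, n, p, trials=100_000, seed=0):
    """Monte Carlo estimate of the same probability."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        x = i
        while 0 < x < n:        # play until ruin ($0) or target ($n)
            x += 1 if rng.random() < p else -1
        wins += (x == n)
    return wins / trials
```

For a fair game (p = 0.5) starting at $5 with target $10, both should give 0.5; for p < 0.5 the win probability drops sharply.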

3. Poisson Processes (12/10/12, see Adult and Baby Ross)
Exponential Distribution
Poisson Processes
Poisson and Exponential Relationship
Generalizations
Exponential Distribution
Definition: The continuous RV X has
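Although the definition above is cut off, the exponential/Poisson relationship listed in the outline can be checked numerically. A minimal sketch (the function name and parameters are mine): summing Exp(λ) interarrival times yields a Poisson process whose count N(T) has mean λT.

```python
import random

def poisson_counts(lam, T, trials=10_000, seed=0):
    """Simulate a rate-lam Poisson process by summing Exp(lam)
    interarrival times; return the arrival counts N(T) over many runs."""
    rng = random.Random(seed)
    counts = []
    for _ in range(trials):
        t, n = 0.0, 0
        while True:
            t += rng.expovariate(lam)   # Exp(lam) gap to next arrival
            if t > T:
                break
            n += 1
        counts.append(n)
    return counts
```

With lam = 2 and T = 3, the sample mean of the counts should be close to λT = 6 (and the sample variance should also be near 6, a signature of the Poisson distribution).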

CS229 Lecture notes
Andrew Ng
Part IV
Generative Learning algorithms
So far, we've mainly been talking about learning algorithms that model
p(y|x; θ), the conditional distribution of y given x. For instance, logistic
regression modeled p(y|x; θ) as hθ(x) = g
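The truncated formula is the standard logistic hypothesis, hθ(x) = g(θᵀx) with the sigmoid g(z) = 1/(1 + e^(-z)). A minimal sketch (the function names are mine):

```python
import math

def h(theta, x):
    """Logistic regression hypothesis h_theta(x) = g(theta^T x),
    where g(z) = 1 / (1 + e^{-z}) is the sigmoid function."""
    z = sum(t * xi for t, xi in zip(theta, x))
    return 1.0 / (1.0 + math.exp(-z))
```

Note that h always lies in (0, 1), which is what lets it be interpreted as p(y = 1 | x; θ); generative algorithms, by contrast, model p(x|y) and p(y) instead.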

CS229 Lecture notes
Andrew Ng
Part V
Support Vector Machines
This set of notes presents the Support Vector Machine (SVM) learning algorithm. SVMs are among the best (and many believe are indeed the best)
off-the-shelf supervised learning algorithms. To tell t

CS229 Lecture notes
Andrew Ng
Part X
Factor analysis
When we have data x(i) ∈ R^n that comes from a mixture of several Gaussians,
the EM algorithm can be applied to fit a mixture model. In this setting,
we usually imagine problems where we have sufficient data

CS229 Lecture notes
Andrew Ng
Supervised learning
Let's start by talking about a few examples of supervised learning problems.
Suppose we have a dataset giving the living areas and prices of 47 houses
from Portland, Oregon:
Living area (feet²)
2104
1600
2

CS229 Lecture notes
Andrew Ng
Part IX
The EM algorithm
In the previous set of notes, we talked about the EM algorithm as applied to
fitting a mixture of Gaussians. In this set of notes, we give a broader view
of the EM algorithm, and show how it can be appl
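As a concrete reference point for the mixture-of-Gaussians case mentioned above, here is a hedged one-dimensional sketch of the two alternating steps (the E-step computes responsibilities, the M-step reestimates the parameters). The two-component restriction, initialization, and all names are my choices, not the notes':

```python
import math

def em_gmm_1d(xs, iters=50):
    """EM for a two-component 1-D Gaussian mixture.
    Returns (weights, means, variances) after `iters` EM rounds."""
    mu = [min(xs), max(xs)]          # crude but deterministic initialization
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibilities r[i][k] = P(z_i = k | x_i; current params)
        r = []
        for x in xs:
            p = [w[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 for k in range(2)]
            s = p[0] + p[1]
            r.append([p[0] / s, p[1] / s])
        # M-step: reestimate weights, means, variances from responsibilities
        for k in range(2):
            nk = sum(ri[k] for ri in r)
            w[k] = nk / len(xs)
            mu[k] = sum(ri[k] * x for ri, x in zip(r, xs)) / nk
            var[k] = sum(ri[k] * (x - mu[k]) ** 2
                         for ri, x in zip(r, xs)) / nk
            var[k] = max(var[k], 1e-6)   # guard against variance collapse
    return w, mu, var
```

On well-separated data (say, samples from N(0, 1) and N(5, 1)) the estimated means land near 0 and 5; the broader view in these notes explains why each round never decreases the data's log-likelihood.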

CS229 Lecture notes
Andrew Ng
Part VI
Learning Theory
1 Bias/variance tradeoff
When talking about linear regression, we discussed the problem of whether
to fit a simple model such as the linear y = θ0 + θ1x, or a more complex
model such as the polynomial y = θ0
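The tradeoff can be seen numerically by fitting polynomials of different degree to noisy samples of a smooth function: a low-degree model underfits (high bias), while a high-degree model drives training error down but generalizes worse (high variance). A hedged sketch (the target function, noise level, and sample sizes are my choices, not the notes'):

```python
import numpy as np

def train_test_errors(degree, seed=0):
    """Fit a degree-`degree` polynomial to 20 noisy samples of a smooth
    target and return (training MSE, held-out MSE vs. the true function)."""
    rng = np.random.default_rng(seed)
    x_tr = np.linspace(0, 1, 20)
    x_te = np.linspace(0, 1, 200)
    f = lambda x: np.sin(2 * np.pi * x)            # true underlying function
    y_tr = f(x_tr) + rng.normal(0, 0.3, x_tr.size) # noisy training targets
    coef = np.polyfit(x_tr, y_tr, degree)
    mse = lambda x, y: float(np.mean((np.polyval(coef, x) - y) ** 2))
    return mse(x_tr, y_tr), mse(x_te, f(x_te))
```

Comparing degree 1 against degree 9 on the same sample illustrates the pattern: the high-degree fit achieves much lower training error, yet its held-out error exceeds its training error because it has partly fit the noise.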

CS229 Problem Set #2
CS 229, Autumn 2015
Problem Set #2: Naive Bayes, SVMs, and Theory
Due in class (9:00am) on Wednesday, October 28.
Notes: (1) These questions require thought, but do not require long answers. Please be as
concise as possible. (2) If