Teach Yourself Logic: Appendix
Some Big Books on Mathematical Logic
Peter Smith
University of Cambridge
August 2, 2014
Pass it on, … That's the game I want you to learn. Pass it on.
Alan Bennett, The History Boys
A Problem Course in Mathematical Logic, Version 1.6
Stefan Bilaniuk
Department of Mathematics, Trent University, Peterborough, Ontario, Canada K9J 7B8
E-mail address: [email protected]
CS229 Lecture notes
Andrew Ng
Supervised learning
Let's start by talking about a few examples of supervised learning problems.
Suppose we have a dataset giving the living areas and prices of 47 houses.
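The housing setup can be sketched numerically with ordinary least squares. This is a minimal pure-Python fit of price = θ0 + θ1·area; the area/price pairs below are invented stand-ins for the 47-house dataset, not the actual data.

```python
# Closed-form least-squares fit of a line y = theta0 + theta1 * x.
def fit_line(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)                     # spread of x
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))   # co-spread of x, y
    theta1 = sxy / sxx
    theta0 = my - theta1 * mx
    return theta0, theta1

# Illustrative numbers only (living area in sq ft, price in $1000s).
areas  = [2104, 1600, 2400, 1416, 3000]
prices = [400, 330, 369, 232, 540]
theta0, theta1 = fit_line(areas, prices)
predict = lambda x: theta0 + theta1 * x
```

With the parameters in hand, `predict` gives the hypothesis hθ(x) for any living area.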
1 The perceptron and large margin classifiers
In this final set of notes on learning theory, we will introduce a different
model of machine learning. Specifically, we have so far …
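As a concrete illustration of the threshold model, here is a minimal perceptron training loop in pure Python, with labels y ∈ {0, 1} and the update θ := θ + α(y − hθ(x))x; the toy data (with an intercept feature x0 = 1) is invented for the example.

```python
# Threshold hypothesis: h(x) = 1 if theta . x >= 0 else 0.
def predict(theta, x):
    return 1 if sum(t * xi for t, xi in zip(theta, x)) >= 0 else 0

# Run the perceptron update over the data for a fixed number of epochs.
def train_perceptron(data, alpha=1.0, epochs=10):
    theta = [0.0] * len(data[0][0])
    for _ in range(epochs):
        for x, y in data:
            err = y - predict(theta, x)          # 0 when correct, +/-1 when wrong
            theta = [t + alpha * err * xi for t, xi in zip(theta, x)]
    return theta

# Linearly separable toy data; first coordinate is the intercept x0 = 1.
data = [([1, 2], 1), ([1, -3], 0), ([1, 4], 1), ([1, -1], 0)]
theta = train_perceptron(data)
```

On separable data like this, the updates stop changing θ once every point is classified correctly.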
Part VII
Regularization and model selection
Suppose we are trying to select among several different models for a learning
problem. For instance, we might be using a polynomial …
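One standard procedure for this is simple hold-out cross validation: fit every candidate model on a training split and keep the one with the lowest error on a held-out split. A minimal sketch, using a constant and a straight line as stand-in candidates for the polynomial family (all data invented):

```python
# Fit a degree-0 model: predict the mean of the training labels.
def fit_constant(xs, ys):
    c = sum(ys) / len(ys)
    return lambda x: c

# Fit a degree-1 model by closed-form least squares.
def fit_line(xs, ys):
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    return lambda x: my + b * (x - mx)

# Hold-out selection: train on `train`, pick the lowest validation MSE.
def holdout_select(models, train, valid):
    def mse(h, data):
        return sum((h(x) - y) ** 2 for x, y in data) / len(data)
    xs = [x for x, _ in train]
    ys = [y for _, y in train]
    fits = [fit(xs, ys) for fit in models]
    return min(fits, key=lambda h: mse(h, valid))

train = [(0, 0.1), (1, 1.0), (2, 2.1), (3, 2.9)]
valid = [(4, 4.0), (5, 5.1)]
best = holdout_select([fit_constant, fit_line], train, valid)
```

Here the line wins, since the constant model badly underfits the held-out points.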
Part IX
The EM algorithm
In the previous set of notes, we talked about the EM algorithm as applied to
fitting a mixture of Gaussians. In this set of notes, we give a broader …
The k-means clustering algorithm
In the clustering problem, we are given a training set {x(1), . . . , x(m)}, and want to group the data into a few cohesive clusters.
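The standard iteration for this problem (Lloyd's algorithm) alternates two steps: assign each point to its closest centroid, then move each centroid to the mean of its assigned points. A minimal pure-Python sketch with invented 2-D data:

```python
# k-means via Lloyd's algorithm: repeat (assign, recompute means).
def kmeans(points, centroids, iters=20):
    for _ in range(iters):
        # Assignment step: each point goes to its nearest centroid.
        clusters = [[] for _ in centroids]
        for p in points:
            j = min(range(len(centroids)),
                    key=lambda j: sum((a - b) ** 2
                                      for a, b in zip(p, centroids[j])))
            clusters[j].append(p)
        # Update step: each centroid becomes its cluster's mean
        # (an empty cluster keeps its old centroid).
        centroids = [
            tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else mu
            for cl, mu in zip(clusters, centroids)
        ]
    return centroids

points = [(0.0, 0.0), (0.0, 1.0), (5.0, 5.0), (5.0, 6.0)]
mus = kmeans(points, centroids=[(0.0, 0.0), (5.0, 5.0)])
```

On this toy set the centroids settle at the means of the two obvious groups after one pass.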
Mixtures of Gaussians and the EM algorithm
In this set of notes, we discuss the EM (Expectation-Maximization) algorithm for density estimation.
Suppose that we are given a training set …
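A minimal 1-D, two-component version of EM can be sketched directly: the E-step computes responsibilities w(i)_j = p(z(i) = j | x(i)), and the M-step re-estimates the mixing weights φ, means μ, and variances from those soft counts. The data and starting parameters below are invented for illustration.

```python
import math

# Gaussian density N(x; mu, var).
def gauss(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm(xs, phi, mus, vars_, iters=50):
    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        w = []
        for x in xs:
            p = [phi[j] * gauss(x, mus[j], vars_[j]) for j in range(2)]
            s = sum(p)
            w.append([pj / s for pj in p])
        # M-step: re-estimate phi, mu, var from the soft assignments.
        for j in range(2):
            nj = sum(wi[j] for wi in w)
            phi[j] = nj / len(xs)
            mus[j] = sum(wi[j] * x for wi, x in zip(w, xs)) / nj
            vars_[j] = sum(wi[j] * (x - mus[j]) ** 2
                           for wi, x in zip(w, xs)) / nj
    return phi, mus, vars_

xs = [-2.2, -1.9, -2.0, 1.8, 2.1, 2.0]
phi, mus, vars_ = em_gmm(xs, phi=[0.5, 0.5], mus=[-1.0, 1.0], vars_=[1.0, 1.0])
```

With two well-separated clumps, the responsibilities harden quickly and the means converge to the two cluster averages.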
Part XII
Independent Components Analysis
Our next topic is Independent Components Analysis (ICA). Similar to PCA,
this will find a new basis in which to represent our data.
Part XI
Principal components analysis
In our discussion of factor analysis, we gave a way to model data x ∈ R^n as
approximately lying in some k-dimensional subspace, where k ≪ n.
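PCA makes the subspace idea concrete: after zero-centering the data, the top principal component is the leading eigenvector of the empirical covariance matrix Σ = (1/m) Σᵢ x(i)x(i)ᵀ. A pure-Python sketch that finds it by power iteration (an implementation choice for this sketch, not a method from the notes), on invented 2-D data:

```python
# Top principal component via power iteration on the covariance matrix.
def top_component(data, iters=200):
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    centered = [[row[j] - means[j] for j in range(d)] for row in data]
    # Empirical covariance Sigma = (1/n) sum x x^T.
    cov = [[sum(r[i] * r[j] for r in centered) / n for j in range(d)]
           for i in range(d)]
    u = [1.0] * d
    for _ in range(iters):
        u = [sum(cov[i][j] * u[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in u) ** 0.5
        u = [x / norm for x in u]     # keep u a unit vector
    return u

# Points spread mostly along the y = x direction.
data = [[0.0, 0.0], [1.0, 1.1], [2.0, 1.9], [3.0, 3.05]]
u = top_component(data)
```

For this data the returned unit vector points (up to sign) along the diagonal, the direction of greatest variance.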
Part X
Factor analysis
When we have data x(i) ∈ R^n that comes from a mixture of several Gaussians,
the EM algorithm can be applied to fit a mixture model. In this setting, we …
Part XIII
Reinforcement Learning and Control
We now begin our study of reinforcement learning and adaptive control.
In supervised learning, we saw algorithms that tried to …
Part IV
Generative Learning algorithms
So far, we've mainly been talking about learning algorithms that model
p(y|x; θ), the conditional distribution of y given x. For instance, …
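By contrast, a generative algorithm models p(x|y) and p(y) separately and classifies via Bayes' rule. A one-feature sketch in the spirit of Gaussian discriminant analysis, simplified here to a per-class variance (the notes' GDA shares one covariance matrix across classes); all data is invented:

```python
import math

# Estimate a class prior p(y) and a 1-D Gaussian p(x | y) per class.
def fit_gda(data):
    params = {}
    for label in set(y for _, y in data):
        xs = [x for x, y in data if y == label]
        mu = sum(xs) / len(xs)
        var = sum((x - mu) ** 2 for x in xs) / len(xs)
        prior = len(xs) / len(data)
        params[label] = (prior, mu, var)
    return params

# Bayes' rule: pick the label maximizing log p(y) + log p(x | y).
def classify(params, x):
    def score(p):
        prior, mu, var = p
        return math.log(prior) - (x - mu) ** 2 / (2 * var) - 0.5 * math.log(var)
    return max(params, key=lambda label: score(params[label]))

data = [(1.0, 0), (1.2, 0), (0.8, 0), (3.0, 1), (3.2, 1), (2.9, 1)]
params = fit_gda(data)
```

The constant term of the Gaussian cancels across classes, so only the prior, the squared distance to each class mean, and the variance term matter in the comparison.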
Part V
Support Vector Machines
This set of notes presents the Support Vector Machine (SVM) learning algorithm. SVMs are among the best (and many believe are indeed the best) "off-the-shelf" supervised learning algorithms.
Part VI
Learning Theory
1 Bias/variance tradeoff
When talking about linear regression, we discussed the problem of whether
to fit a simple model such as the linear y = θ0 + θ1x …
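The tradeoff can be seen on a toy dataset: a constant predictor (high bias) underfits, a memorizer that returns the nearest training label (high variance) drives training error to zero yet generalizes poorly, and the least-squares line sits in between. All numbers below are invented.

```python
# Mean squared error of hypothesis h on a list of (x, y) pairs.
def mse(h, data):
    return sum((h(x) - y) ** 2 for x, y in data) / len(data)

def fit_constant(train):                       # high bias: predict the mean
    c = sum(y for _, y in train) / len(train)
    return lambda x: c

def fit_line(train):                           # least-squares straight line
    xs = [x for x, _ in train]
    mx, my = sum(xs) / len(xs), sum(y for _, y in train) / len(train)
    b = (sum((x - mx) * (y - my) for x, y in train)
         / sum((x - mx) ** 2 for x in xs))
    return lambda x: my + b * (x - mx)

def fit_memorizer(train):                      # high variance: 1-nearest neighbor
    return lambda x: min(train, key=lambda p: abs(p[0] - x))[1]

# Labels follow y = 2x with a little fixed "noise" baked in.
train = [(0, 0.3), (1, 1.8), (2, 4.2), (3, 5.9), (4, 8.1)]
test  = [(0.5, 1.0), (1.5, 3.0), (2.5, 5.0), (3.5, 7.0)]

errors = {name: (mse(fit(train), train), mse(fit(train), test))
          for name, fit in [("constant", fit_constant),
                            ("line", fit_line),
                            ("1-nn", fit_memorizer)]}
```

Comparing the (train, test) error pairs shows the classic pattern: the memorizer is perfect on the training set but beaten on the test set by the line, while the constant is worst on both.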