CSE 250: Machine Learning Theory
Spring 2016
Homework 1
Due on: Wed April 20
Instructor: Raef Bassily
Instructions and Notes
For your proofs, you may use any result covered in class, and its analysis
CSE 250: Machine Learning Theory
Spring 2016
Homework 2
Instructor: Raef Bassily
Due on: Wed May 4
Instructions and Notes
For your proofs, you may use any result covered in class, and its analysis, b
Gödel's Incompleteness Theorem
CSE 205A Lecture Notes
Last updated on 2007-05-05
Suppose we have the vocabulary V = {…, +, …, …, =, <} for N, the natural numbers. We will show
{φ | N ⊨ φ}, that is, th
Finite Models
May 9, 2007
Having finished our discussion on classical logic on arbitrary models, we
now turn specifically to finite models. This is a whole subarea of logic, called
Finite Model Theory
CSE 250: Machine Learning Theory
Spring 2016
Bonus Quiz
Due on: Wed May 11
Instructor: Raef Bassily
Instructions
For your proofs, you may use any result covered in class, and its analysis, but please
CSE 250: Machine Learning Theory
Spring 2016
Part 3
Instructor: Raef Bassily
Scribe: Andrew Leverentz
3.1 Vapnik-Chervonenkis (VC) Dimension, continued
Definition 3.1 (Set Shattering). A hypothesis class
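The definition can be checked mechanically on small examples: a class shatters a finite set exactly when every one of the 2^n labelings of the points is realized by some hypothesis. The sketch below (the threshold class and the test points are illustrative assumptions, not from the notes) tests this by brute force:

```python
from itertools import product

def shatters(hypotheses, points):
    """True iff every {0,1}-labeling of `points` is realized by some hypothesis."""
    realized = {tuple(h(x) for x in points) for h in hypotheses}
    return all(tuple(lab) in realized for lab in product([0, 1], repeat=len(points)))

# Threshold classifiers h_t(x) = 1 iff x >= t, over a grid of thresholds.
thresholds = [t / 2 for t in range(-10, 11)]
H = [(lambda x, t=t: int(x >= t)) for t in thresholds]

print(shatters(H, [1.0]))       # True: one point can get either label
print(shatters(H, [1.0, 2.0]))  # False: labeling (1, 0) is never realized
```

This matches the fact that thresholds on the line have VC dimension 1: any single point is shattered, but no two-point set is.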
CSE 250: Machine Learning Theory
Spring 2016
Homework 3 (Project)
Instructor: Raef Bassily
Due on: Wed June 1
Please read the following instructions carefully
The homework is a mini-project where it
CSE 250: Machine Learning Theory
Spring 2016
Part 2 - The PAC Model
Instructor: Raef Bassily
Scribe: Andrew Leverentz
2.1 Statistical Learning Framework
The basic setup for statistical learning:
Input
Introduction
Neural networks for solving large-scale tasks require many
parameters, leading to a danger of over-fitting.
One way to deal with that is to convolve the same filter across
many units (weight sharing).
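The parameter savings from weight sharing can be made concrete with a quick count. The layer sizes below are illustrative assumptions (a 32x32 input mapped to a 28x28 feature map with a 5x5 filter):

```python
# Parameter count: fully connected layer vs. one shared convolutional filter.
in_h = in_w = 32    # input grid
out_h = out_w = 28  # output feature map
k = 5               # 5x5 filter

dense_params = (in_h * in_w) * (out_h * out_w)  # one weight per input-output pair
conv_params = k * k                             # one shared filter, reused everywhere

print(dense_params)  # 802816
print(conv_params)   # 25
```

The shared filter uses tens of parameters where the dense layer uses hundreds of thousands, which is the over-fitting argument in miniature.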
Introduction to Convolutional Networks
Lecture 7
Rob Fergus
New York University
As told by Gary Cottrell
UCSD
[Diagram: taxonomy of SUPERVISED methods: Recurrent Neural Net, Boosting, Convolutional Neural Net, Perceptron, Neural Net, SVM]
First-Order Logic
1 Syntax
Domain of Discourse
The domain of discourse for first order logic is FO structures or models. A
FO structure contains
Relations
Functions
Constants (functions of arity 0)
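The three ingredients of an FO structure can be written down directly for a small finite example. The sketch below (the specific structure, with domain {0,…,4}, is an illustrative assumption) represents a structure as a domain plus interpretations of relations, functions, and constants:

```python
# A first-order structure: domain + relations + functions + constants (arity 0).
structure = {
    "domain": set(range(5)),
    "relations": {"<": {(a, b) for a in range(5) for b in range(5) if a < b}},
    "functions": {"succ": lambda a: (a + 1) % 5},
    "constants": {"zero": 0},
}

# Evaluate the atomic sentence  zero < succ(zero)  in this structure.
z = structure["constants"]["zero"]
holds = (z, structure["functions"]["succ"](z)) in structure["relations"]["<"]
print(holds)  # True, since 0 < 1
```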
Proceedings of the Twenty-Ninth AAAI Conference on Innovative Applications (IAAI-17)
Using Deep and Convolutional Neural Networks for
Accurate Emotion Classification on DEAP Dataset
Samarth Tripathi
Shr
Ingredients of Logic
Domain of Discourse: What we want to talk about.
Language: To formulate statements. Assign meanings in terms of domain of discourse.
Deductive System: For drawing inferences within
Neural Networks for Machine Learning
Lecture 6a
Overview of mini-batch gradient descent
Geoffrey Hinton
with
Nitish Srivastava
Kevin Swersky
Reminder: The error surface for a linear neuron
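The mini-batch procedure surveyed in this lecture can be written out for a single linear neuron, whose error surface is a quadratic bowl. Everything below (the synthetic target y = 3x + 1, the learning rate, the batch size) is an illustrative assumption:

```python
import random

# Mini-batch gradient descent for one linear neuron y_hat = w*x + b,
# trained on noise-free data from y = 3x + 1.
random.seed(0)
data = [(x, 3 * x + 1) for x in [i / 10 for i in range(-20, 21)]]

w, b, lr, batch_size = 0.0, 0.0, 0.05, 8
for epoch in range(200):
    random.shuffle(data)
    for i in range(0, len(data), batch_size):
        batch = data[i:i + batch_size]
        # Gradient of mean squared error over the mini-batch.
        gw = sum(2 * (w * x + b - y) * x for x, y in batch) / len(batch)
        gb = sum(2 * (w * x + b - y) for x, y in batch) / len(batch)
        w -= lr * gw
        b -= lr * gb

print(round(w, 2), round(b, 2))  # close to 3.0 1.0
```

Because the loss is quadratic in (w, b), the noisy mini-batch steps still converge to the unique minimum; the batch size trades gradient noise against updates per epoch.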
The Tricks of the Trade
(Read LeCun et al., Efficient Backprop,
through section 4.7)
Garrison W. Cottrell
Gary's Unbelievable Research Unit (GURU)
Computer Science and Engineering Department
Temporal Dynamics of Learning Center
Neural Networks for Pattern Recognition
CSE 253
A.K.A. The deep nets course
Statistical Pattern Recognition
CSE 253
Complementary courses, different content:
Anything taught by Manmohan Chandraker,
Ka
Backprop: Representations, Representations, Representations
Garrison W. Cottrell
Gary's Unbelievable Research Unit (GURU)
Computer Science and Engineering Department
Temporal Dynamics of Learning Center
CSE 250: Machine Learning Theory
Spring 2016
Mid-term Exam
Date: May 9, 2016
Instructor: Raef Bassily
Instructions
Maximum Grade = 20 points. Total points = 28 points.
Prove all your claims, you may
CSE 250: Machine Learning Theory
Spring 2016
Part 5 - Introduction to Convex Learning
Instructor: Raef Bassily
Scribe: Andrew Leverentz
5.1 Convex Learning
Now we will discuss some generalizations to
CSE 250: Machine Learning Theory
Spring 2016
Lecture 12
Instructor: Raef Bassily
Scribe: Andrew Leverentz
12.1 Convex Learning, continued
Recall the setup for regression:
X ⊆ R^m,  Y ⊆ R,  H = {h_w : X → Y, …
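For the linear class h_w(x) = ⟨w, x⟩ with squared loss, the empirical risk is convex in w and ERM has a closed form. A one-dimensional sketch (the data points are illustrative assumptions): minimizing (1/n) Σ (w·x_i − y_i)² gives w* = Σ x_i y_i / Σ x_i².

```python
# Closed-form ERM for 1-D linear regression with squared loss.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]  # roughly y = 2x

w_star = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
risk = sum((w_star * x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
print(round(w_star, 3))  # 1.99
```

Setting the derivative of the empirical risk to zero gives exactly this ratio, which is the one-dimensional normal equation.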
Other topics for future consideration
& Concluding remarks
More on Linear Classifiers:
Perceptron & SVMs
More on Linear Classifiers
Linearly separable case (i.e., realizability holds): training examples (x^(1), y^(1)), …, (x^(n), y^(n))
CSE 250: Machine Learning Theory
Spring 2016
Lecture 6
Instructor: Raef Bassily
Scribe: Andrew Leverentz
There are two challenges we still need to address:
1. Realizability: We've been sweeping the realizability assumption under the rug
CSE 250: Machine Learning Theory
Spring 2016
Lecture 1
Instructor: Raef Bassily
Scribe: Andrew Leverentz
We will discuss various concentration inequalities:
Markov inequality
Chebyshev inequality
Chernoff bounds
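Markov's inequality, P(X ≥ a) ≤ E[X]/a for nonnegative X, can be checked empirically. The sketch below (the uniform distribution on {0,…,9} and the threshold a = 8 are illustrative assumptions) compares the empirical tail probability with the bound:

```python
import random

# Empirical check of Markov's inequality for X ~ Uniform{0, ..., 9}.
random.seed(1)
samples = [random.randrange(10) for _ in range(100_000)]
mean = sum(samples) / len(samples)  # E[X] = 4.5

a = 8
empirical = sum(1 for x in samples if x >= a) / len(samples)  # true P(X >= 8) = 0.2
markov_bound = mean / a                                       # about 0.5625

print(empirical <= markov_bound)  # True
```

The bound holds but is loose (0.2 vs. roughly 0.56), which motivates the sharper Chebyshev and Chernoff bounds that use variance and moment generating functions respectively.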
CSE 250: Machine Learning Theory
Spring 2016
Lecture 4
Instructor: Raef Bassily
Scribe: Andrew Leverentz
Example 4.1 (Accuracy/confidence analysis for axis-aligned rectangles). First, fix 0 < ε and
0 < δ.
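The learner in this example can be simulated directly: output the tightest axis-aligned rectangle enclosing the positive examples, then estimate its error on fresh data. All specific numbers below (the target rectangle, sample sizes) are illustrative assumptions:

```python
import random

# Tightest-fit rectangle learner on uniform samples from [0,1]^2.
random.seed(0)
target = (0.2, 0.7, 0.3, 0.8)  # (x_lo, x_hi, y_lo, y_hi)

def label(p, r):
    return r[0] <= p[0] <= r[1] and r[2] <= p[1] <= r[3]

train = [(random.random(), random.random()) for _ in range(2000)]
pos = [p for p in train if label(p, target)]
hyp = (min(p[0] for p in pos), max(p[0] for p in pos),
       min(p[1] for p in pos), max(p[1] for p in pos))

# Estimate the generalization error on a large fresh sample.
test = [(random.random(), random.random()) for _ in range(20000)]
err = sum(1 for p in test if label(p, target) != label(p, hyp)) / len(test)
print(err < 0.05)  # with 2000 samples the error is far below 0.05
```

Note the hypothesis only errs on the four thin strips between the target and the tightest fit, which is exactly the decomposition the accuracy/confidence analysis exploits.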
CSE 250: Machine Learning Theory
Spring 2016
Lecture 15
Instructor: Raef Bassily
Scribe: Andrew Leverentz
Before today, most of our learning paradigms were based on empirical risk minimization (ERM)
o
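For a finite hypothesis class, the ERM rule from the earlier lectures is just an argmin over empirical errors. A minimal sketch (the threshold class and the labeled sample are illustrative assumptions):

```python
# ERM over a finite class: return a hypothesis minimizing empirical (0-1) risk.
def erm(hypotheses, sample):
    def emp_risk(h):
        return sum(1 for x, y in sample if h(x) != y) / len(sample)
    return min(hypotheses, key=emp_risk)

# Finite class of thresholds on [0, 1]; h_t(x) = 1 iff x >= t.
H = [(lambda x, t=t: int(x >= t)) for t in [i / 10 for i in range(0, 11)]]
sample = [(0.1, 0), (0.2, 0), (0.4, 0), (0.6, 1), (0.8, 1), (0.9, 1)]

h = erm(H, sample)
print(sum(1 for x, y in sample if h(x) != y))  # 0: the sample is realizable by H
```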