Learning Bounded-Size Monomials
In the last lecture we described and analyzed an algorithm for learning monomials that used a
sample of size O((n + log(1/δ))/ε). Recall
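The algorithm referred to here is the standard elimination algorithm for monomials. A minimal sketch (the function and variable names are mine, not the notes' pseudocode): start from the conjunction of all 2n literals and delete every literal falsified by a positive example.

```python
def learn_monomial(sample):
    """Elimination algorithm for learning monomials over {0,1}^n.

    Start with the conjunction of all 2n literals (x_i and its negation)
    and delete every literal falsified by some positive example.
    `sample` is a list of (a, b) pairs, a a 0/1 tuple, b in {0,1}.
    """
    n = len(sample[0][0])
    pos = set(range(n))      # indices i with literal x_i still present
    neg = set(range(n))      # indices i with literal not-x_i still present
    for a, b in sample:
        if b == 1:           # positive examples eliminate falsified literals
            pos -= {i for i in range(n) if a[i] == 0}
            neg -= {i for i in range(n) if a[i] == 1}

    def h(a):
        return int(all(a[i] == 1 for i in pos) and all(a[i] == 0 for i in neg))
    return h

# Toy target: x0 AND not-x2 over {0,1}^3
sample = [((1, 0, 0), 1), ((1, 1, 0), 1), ((0, 1, 0), 0), ((1, 0, 1), 0)]
h = learn_monomial(sample)
print([h(a) for a, _ in sample])   # consistent with all four labels
```

The returned hypothesis is consistent with the sample by construction, which is what the O((n + log(1/δ))/ε) Occam-style sample bound requires.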
Learning Singletons, Occam's Razor, and Learning Monomials
Learning Singletons
We next turn to a very simple learning problem, where X = {0,1}^n and F is the class of all
singleton functions F_sin.
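Assuming, as is standard, that each singleton f_i outputs the i-th coordinate of its input, a consistent-hypothesis learner for F_sin can be sketched as follows (a hedged illustration; the names are mine):

```python
def learn_singleton(sample):
    """Find an index i such that f_i(a) = a_i is consistent with the sample.

    `sample` is a list of (a, b) pairs; the class F_sin contains the n
    functions f_i(a) = a_i, one per coordinate.
    """
    n = len(sample[0][0])
    candidates = set(range(n))
    for a, b in sample:
        # discard every coordinate that disagrees with this label
        candidates -= {i for i in candidates if a[i] != b}
    i = min(candidates)          # any surviving index is consistent
    return lambda a: a[i]

# Toy sample labeled by f_1 (the second coordinate)
sample = [((0, 1, 0), 1), ((1, 1, 1), 1), ((1, 0, 0), 0)]
h = learn_singleton(sample)
print(h((0, 1, 0)), h((1, 0, 0)))   # 1 0
```

Since |F_sin| = n, the usual Occam argument gives a sample size logarithmic in n for this learner.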
Some Useful Probabilistic Facts
For a random variable χ we denote the expected value of χ by Exp[χ].
Linearity of Expectation
For any two random variables χ1 and χ2, Exp[χ1 + χ2] = Exp[χ1] + Exp[χ2].
Union Bound
For any two events A1 and A2, Prob[A1 ∪ A2] ≤ Prob[A1] + Prob[A2].
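Both facts can be checked exactly on a small finite probability space; here is a quick sanity check on two fair dice (my own toy example, not from the notes):

```python
from itertools import product
from fractions import Fraction

# Probability space: two fair dice, chi1 = first die, chi2 = second die.
space = list(product(range(1, 7), repeat=2))
p = Fraction(1, len(space))          # uniform probability 1/36

# Linearity of expectation: Exp[chi1 + chi2] = Exp[chi1] + Exp[chi2]
exp_chi1 = sum(p * a for a, _ in space)
exp_chi2 = sum(p * b for _, b in space)
exp_sum = sum(p * (a + b) for a, b in space)
print(exp_sum == exp_chi1 + exp_chi2)    # True (7 = 7/2 + 7/2)

# Union bound with A1 = "first die is 6", A2 = "second die is 6"
pr_a1 = sum(p for a, _ in space if a == 6)
pr_a2 = sum(p for _, b in space if b == 6)
pr_union = sum(p for a, b in space if a == 6 or b == 6)
print(pr_union <= pr_a1 + pr_a2)         # True: 11/36 <= 12/36
```

The union-bound inequality is strict here because the two events can occur together; it is tight exactly when the events are disjoint.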
Learning In the Presence of Noise
In the PAC model we assumed that the sample is labeled by a target (Boolean) function f ∈ F.
That is, the sample S = {(a1, b1), . . . , (am, bm)} (where aj ∈ X (e.
Computational Learning Theory
Lecture Notes for CS 582
Spring Semester, 1991
Sally A. Goldman
Department of Computer Science, Washington University
St. Louis, Missouri 63130
WUCS-91-36
Preface
This m
Learning k-term DNF and Proper vs. Non-Proper Learning
We next consider the following family of functions that extend monomials: k-term DNF. These
are functions of the form T1 ∨ T2 ∨ · · · ∨ Tk, where each
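Assuming the standard definition (each Ti a monomial, i.e. a conjunction of literals), a k-term DNF can be sketched concretely as follows; the helper names are mine:

```python
def make_term(pos, neg):
    """Monomial T requiring x_i = 1 for i in pos and x_i = 0 for i in neg."""
    return lambda a: all(a[i] == 1 for i in pos) and all(a[i] == 0 for i in neg)

def k_term_dnf(terms):
    """T1 or T2 or ... or Tk: true iff at least one term is satisfied."""
    return lambda a: int(any(t(a) for t in terms))

# f = (x0 AND x1) OR (not-x2): a 2-term DNF over {0,1}^3
f = k_term_dnf([make_term({0, 1}, set()), make_term(set(), {2})])
print(f((1, 1, 1)), f((0, 0, 0)), f((0, 1, 1)))   # 1 1 0
```

Note that a single monomial is the special case k = 1, which is the sense in which this class extends monomials.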
Introduction, Learning Rectangles, and the PAC Model
There are various models for Machine Learning. In this course we shall mainly (though not
only) focus on one such formal model and a few of its variants.
Hypothesis Testing and Hypothesis Selection
In this lecture we'll talk about issues related to learning (and in particular agnostic learning).
I'll refer to them as hypothesis testing and hypothesis selection.
Learning k -CNF and Learning Decision Lists
We first observe that for constant k, the algorithm for testing monomials can be extended to
get an algorithm for testing k-CNF functions. These are functions that can be written as an
AND of clauses, where each clause is an OR of at most k literals.
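The standard extension works by introducing one new Boolean variable per clause of size at most k: over the expanded variables, a k-CNF is simply a monomial, so the monomial algorithm from the earlier lecture can be run on the expanded sample. A hedged sketch of the expansion step (my own naming):

```python
from itertools import combinations

def clause_features(a, n, k):
    """Map a in {0,1}^n to the values of all clauses over <= k literals.

    A clause is an OR of at most k literals. A k-CNF over the original
    variables is exactly a monomial (an AND) over these clause-features,
    so a monomial learner can be applied to the expanded sample.
    """
    literals = [(i, v) for i in range(n) for v in (0, 1)]  # (index, satisfying value)
    feats = []
    for size in range(1, k + 1):
        for clause in combinations(literals, size):
            feats.append(int(any(a[i] == v for i, v in clause)))
    return tuple(feats)

# Expand a tiny sample over {0,1}^2 for k = 2:
sample = [((1, 0), 1), ((0, 1), 0)]
expanded = [(clause_features(a, n=2, k=2), b) for a, b in sample]
print(len(expanded[0][0]))   # 10 clause-features for n = 2, k = 2
```

The expansion blows the number of variables up to O((2n)^k), which is polynomial only because k is a constant, matching the "for constant k" caveat above.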
Learning Deterministic Finite Automata with Queries
Let us return to the standard PAC model where there is an unknown function f from
a known family of functions F. As we discussed in the past,
Boosting
The idea of boosting is the following: given a weak learning algorithm, which is only ensured
(with high probability) to output a hypothesis that has error bounded away from 1/2 (i.e., at most
1/2 − γ for some γ > 0), obtain a strong learning algorithm whose error can be made arbitrarily small.
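One concrete realization of this idea is AdaBoost (a later scheme than whatever variant these notes analyze, but illustrating the same principle): reweight the sample so each weak hypothesis must beat chance where its predecessors failed, then take a weighted majority vote. A minimal sketch with my own toy weak learner:

```python
import math

def adaboost(X, y, weak_learner, rounds):
    """Minimal AdaBoost with +1/-1 labels: maintain sample weights,
    call the weak learner on the reweighted sample each round, and
    output a weighted majority vote of the weak hypotheses."""
    m = len(X)
    w = [1.0 / m] * m
    ensemble = []                           # (alpha, hypothesis) pairs
    for _ in range(rounds):
        h = weak_learner(X, y, w)
        err = sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi)
        err = max(err, 1e-12)               # guard against a perfect round
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, h))
        # up-weight mistakes, down-weight correct points, renormalize
        w = [wi * math.exp(-alpha * yi * h(xi)) for wi, xi, yi in zip(w, X, y)]
        z = sum(w)
        w = [wi / z for wi in w]
    return lambda x: 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1

def stump_learner(X, y, w):
    """Weak learner: lowest-weighted-error threshold stump on 1-D data."""
    best, best_err = None, float("inf")
    for t in sorted(set(X)):
        for sign in (1, -1):
            err = sum(wi for wi, xi, yi in zip(w, X, y)
                      if (sign if xi >= t else -sign) != yi)
            if err < best_err:
                best_err, best = err, (t, sign)
    t, sign = best
    return lambda x, t=t, s=sign: s if x >= t else -s

# Toy target: +1 on [2,4], -1 elsewhere. No single stump is consistent
# (each stump errs on at least 2 of 7 points), but boosted stumps are.
X = [0, 1, 2, 3, 4, 5, 6]
y = [-1, -1, 1, 1, 1, -1, -1]
H = adaboost(X, y, stump_learner, rounds=5)
print([H(x) for x in X])   # agrees with y on every point
```

The toy run shows the headline phenomenon: each weak hypothesis has error at least 2/7 under the uniform distribution, yet the combined vote classifies the whole sample correctly.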
Agnostic Learning
We now turn to what is known as agnostic learning (or unrealizable learning). In this model,
the target function f : X → {0, 1} is unknown, as in the standard PAC model, but further
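In the agnostic setting no hypothesis in F is guaranteed to fit the sample perfectly, so the natural learner is empirical risk minimization: output the hypothesis with the smallest empirical error. A minimal sketch for a finite class (the class and sample are my own toy example):

```python
def erm(hypotheses, sample):
    """Empirical risk minimization over a finite hypothesis class:
    return the hypothesis with the fewest disagreements on the sample."""
    def emp_error(h):
        return sum(1 for a, b in sample if h(a) != b) / len(sample)
    return min(hypotheses, key=emp_error)

# Toy class over {0,1}^2: the two coordinate (dictator) functions.
hypotheses = [lambda a: a[0], lambda a: a[1]]
# Noisy sample: neither hypothesis is perfect; a[0] errs on 1/5 points,
# a[1] errs on 2/5, so ERM selects the first coordinate.
sample = [((1, 0), 1), ((0, 1), 0), ((1, 1), 1), ((0, 0), 0), ((1, 0), 0)]
best = erm(hypotheses, sample)
print(best((1, 0)), best((0, 1)))   # 1 0
```

Uniform convergence bounds then guarantee that, with enough samples, the selected hypothesis is nearly as good as the best one in F, even though its error need not be small in absolute terms.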