A Complete and Tight Average-Case Analysis of
Learning Monomials
Rüdiger Reischuk
Institut für Theoretische Informatik
Med. Universität zu Lübeck
Wallstraße 40
23560 Lübeck, Germany
reischuk@informatik.mu-luebeck.de
Thomas Zeugmann
Department of Informat
Learning Bounded-Size Monomials
In the last lecture we described and analyzed an algorithm for learning monomials that used a
sample of size O((n + log(1/δ))/ε). Recall that the algorithm only used the positive examples in
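The fragment breaks off here, but the positive-examples monomial learner it refers to is the standard deletion strategy: start from the conjunction of all 2n literals and delete every literal that some positive example falsifies. A minimal sketch of that strategy in Python — the names (`learn_monomial`, the example target) are ours, not the notes':

```python
import random

def learn_monomial(positive_examples, n):
    """Keep exactly the literals consistent with every positive example.
    A literal (i, 1) stands for x_i and (i, 0) for its negation."""
    literals = {(i, b) for i in range(n) for b in (0, 1)}
    for a in positive_examples:              # a is an n-bit tuple
        literals = {(i, b) for (i, b) in literals if a[i] == b}
    return literals

def evaluate(literals, a):
    """A monomial is true iff every one of its literals is satisfied."""
    return all(a[i] == b for (i, b) in literals)

# Hypothetical target: x_0 AND (NOT x_2) over n = 4 variables.
random.seed(0)
n, target = 4, {(0, 1), (2, 0)}
draws = [tuple(random.randint(0, 1) for _ in range(n)) for _ in range(200)]
positives = [a for a in draws if evaluate(target, a)]
h = learn_monomial(positives, n)
```

By construction the hypothesis never errs on a positive example, and every literal of the target survives the deletions, so all errors are one-sided.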
Learning Singletons, Occam's Razor, and Learning Monomials
1 Learning Singletons
We next turn to a very simple learning problem, where X = {0, 1}^n and F is the class of all
singleton functions F_sin. That is, F_sin = {f_1, . . . , f_n}, where for each i
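The definition of each f_i is cut off above, but whatever its precise form, a finite class like F_sin admits the generic consistent-hypothesis strategy: return any f_i that agrees with the whole labeled sample. A sketch, using a projection-style f_i(a) = a[i] purely as an assumed stand-in for the truncated definition:

```python
def consistent(fs, sample):
    """Return the name of some hypothesis in the finite class fs that
    agrees with every labeled example, or None if none does."""
    for name, f in fs.items():
        if all(f(a) == b for a, b in sample):
            return name
    return None

# Stand-in class (assumption: the notes' definition is cut off above):
# f_i(a) = a[i], the i-th projection, for i = 0..n-1.
n = 3
F_sin = {i: (lambda a, i=i: a[i]) for i in range(n)}
sample = [((1, 0, 1), 0), ((0, 1, 1), 1), ((1, 1, 0), 1)]
```

Here only f_1 agrees with all three examples, so `consistent` returns 1; since |F_sin| = n, a sample of size O(log(n/δ)/ε) suffices for any consistent learner.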
Some Useful Probabilistic Facts
For a random variable χ we denote the expected value of χ by Exp[χ].
Linearity of Expectation
For any two random variables χ1 and χ2, Exp[χ1 + χ2] = Exp[χ1] + Exp[χ2].
Union Bound
Let E1 and E2 be two events over the same probabil
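Both facts can be verified exactly on a small finite probability space. A quick check (our own toy example, two fair dice) using exact rational arithmetic:

```python
from itertools import product
from fractions import Fraction

# Uniform distribution over the 36 outcomes of two fair dice.
outcomes = list(product(range(1, 7), repeat=2))
p = Fraction(1, len(outcomes))

def exp(f):
    """Expected value of the random variable f under this distribution."""
    return sum(p * f(o) for o in outcomes)

def prob(event):
    return sum(p for o in outcomes if event(o))

# Linearity of expectation: Exp[X1 + X2] = Exp[X1] + Exp[X2].
lhs = exp(lambda o: o[0] + o[1])
rhs = exp(lambda o: o[0]) + exp(lambda o: o[1])

# Union bound: Pr[E1 or E2] <= Pr[E1] + Pr[E2].
e1 = lambda o: o[0] == 6
e2 = lambda o: o[1] == 6
union = prob(lambda o: e1(o) or e2(o))
```

Both sides of the linearity identity come out to 7, and the union bound is strict here (11/36 versus 12/36) because the two events overlap on the outcome (6, 6).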
Learning In the Presence of Noise
In the PAC model we assumed that the sample is labeled by a target (Boolean) function f ∈ F.
That is, the sample S = {(a1, b1), . . . , (am, bm)} (where aj ∈ X (e.g., X = {0, 1}^n) and bj ∈ {0, 1})
is such that there
Computational Learning Theory
Lecture Notes for CS 582
Spring Semester, 1991
Sally A. Goldman Department of Computer Science Washington University St. Louis, Missouri 63130 WUCS-91-36
Preface
This manuscript is a compilation of lecture notes from the gr
Learning k-term DNF and Proper vs. Non-Proper Learning
We next consider the following family of functions that extends monomials: k-term DNF. These
are functions of the form T1 ∨ T2 ∨ · · · ∨ Tk, where each term Tj is a monomial. By Occam's razor,
as long as k i
Introduction, Learning Rectangles, and the PAC Model
There are various models for Machine Learning. In this course we shall mainly (though not
only) focus on one such formal model and a few of its variants. This model is called the Probably
Approximately
Hypothesis Testing and Hypothesis Selection
In this lecture we'll talk about issues related to learning (and in particular agnostic learning).
I'll refer to them as hypothesis testing and hypothesis selection (or model selection).
1 Hypothesis Testing
Supp
Learning k-CNF and Learning Decision Lists
We first observe that for constant k, the algorithm for testing monomials can be extended to
get an algorithm for testing k-CNF functions. These are functions that consist of a conjunction
(And) of clauses, wher
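The extension hinted at here is a reduction: introduce one new "variable" for each possible clause of at most k literals (there are O(n^k) of them for constant k), and run the positive-examples monomial learner in that clause space. An illustrative sketch, with our own names; tautological clauses are kept for simplicity since they are harmless in the conjunction:

```python
from itertools import combinations, product

def clauses(n, k):
    """All clauses (disjunctions) of at most k literals over x_0..x_{n-1};
    a literal (i, 1) means x_i and (i, 0) means its negation."""
    lits = [(i, b) for i in range(n) for b in (0, 1)]
    return [frozenset(c) for r in range(1, k + 1)
            for c in combinations(lits, r)]

def clause_true(clause, a):
    return any(a[i] == b for (i, b) in clause)

def learn_kcnf(positives, n, k):
    """Monomial-style deletion in the clause space: keep every clause that
    all positive examples satisfy; the hypothesis is their conjunction."""
    return [c for c in clauses(n, k)
            if all(clause_true(c, a) for a in positives)]

def evaluate(hypothesis, a):
    return all(clause_true(c, a) for c in hypothesis)

# Hypothetical target 2-CNF over 3 variables: the single clause (x_0 OR x_1).
positives = [a for a in product((0, 1), repeat=3) if a[0] or a[1]]
h = learn_kcnf(positives, n=3, k=2)
```

Every clause of the target survives the deletions, so, as with monomials, the learned conjunction errs only by rejecting positives it has not yet seen.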
Learning Deterministic Finite Automata with Queries
Let us return to the standard PAC model where there is an unknown function f from
a known family of functions F . As we discussed in the past, the learning problem may be hard
from a computational p
Boosting
The idea of boosting is the following: Given a weak learning algorithm, which is only ensured
(with high probability) to output a hypothesis that has error bounded away from 1/2 (i.e., at most
1/2 − γ for a small γ > 0), we want to obtain a strong learnin
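To see why error bounded away from 1/2 is enough, consider the idealized case of T independent weak hypotheses, each correct with probability 1/2 + γ: a majority vote is then correct with probability approaching 1 exponentially fast in T. (Real boosting does not get independence for free; this exact binomial computation, in our own code, is only the intuition.)

```python
from math import comb

def majority_correct(T, p):
    """Probability that a strict majority of T independent votes, each
    correct with probability p, is correct (T odd)."""
    return sum(comb(T, t) * p**t * (1 - p)**(T - t)
               for t in range(T // 2 + 1, T + 1))
```

For example, with p = 0.6 a single vote is correct with probability 0.6, while three independent votes already push the majority's success probability up to 0.648, and the advantage compounds as T grows.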
Agnostic Learning
We now turn to what is known as agnostic learning (or unrealizable learning). In this model,
the target function f : X → {0, 1} is unknown, as in the standard PAC model, but furthermore,
nothing is known about the class F that f belongs
ON-CHIP TRANSFORMER MODELING, CHARACTERIZATION,
AND APPLICATIONS IN POWER AND LOW NOISE AMPLIFIERS
A DISSERTATION
SUBMITTED TO THE DEPARTMENT OF ELECTRICAL ENGINEERING
AND THE COMMITTEE ON GRADUATE STUDIES
OF STANFORD UNIVERSITY
IN PARTIAL FULFILLMENT OF