© Eric Xing @ CMU, 2006-2008
Machine Learning
10-701/15-781, Fall 2008

Computational Learning Theory II

Eric Xing
Lecture 11, October 13, 2008

Reading: Chap. 7 of T. Mitchell's book, and outline material
Last time: PAC and Agnostic Learning

- Finite H, assume target function c ∈ H.
  Suppose we want the probability that a consistent learner outputs a hypothesis with true error greater than ε to be at most δ. Then m examples suffice:

      m ≥ (1/ε) (ln|H| + ln(1/δ))

- Finite H, agnostic learning: perhaps c is not in H.
  With probability at least (1 − δ), every h in H satisfies

      error_true(h) ≤ error_train(h) + sqrt( (ln|H| + ln(1/δ)) / (2m) )
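These bounds are easy to sanity-check numerically. A minimal sketch in Python (the values of |H|, ε, and δ below are made up purely for illustration):

```python
import math

def pac_sample_bound(H_size, eps, delta):
    """Smallest m satisfying m >= (1/eps) * (ln|H| + ln(1/delta))."""
    return math.ceil((math.log(H_size) + math.log(1.0 / delta)) / eps)

def agnostic_error_gap(H_size, delta, m):
    """Hoeffding/union-bound gap: sqrt((ln|H| + ln(1/delta)) / (2m))."""
    return math.sqrt((math.log(H_size) + math.log(1.0 / delta)) / (2.0 * m))

# Hypothetical example: |H| = 2**20 hypotheses, eps = 0.05, delta = 0.01
m = pac_sample_bound(2**20, eps=0.05, delta=0.01)      # -> 370 examples
gap = agnostic_error_gap(2**20, delta=0.01, m=m)       # ~0.158
```

Note that m grows only logarithmically in |H|, which is what makes the agnostic bound usable even for very large (but finite) hypothesis classes.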

What if H is not finite?

- Can't use our result for infinite H
- Need some other measure of complexity for H
  – the Vapnik-Chervonenkis (VC) dimension!
What if H is not finite?

An informal derivation:

- Suppose we have an H that is parameterized by d real numbers. Since we are using a computer to represent real numbers, and IEEE double-precision floating point (a `double` in C) uses 64 bits per number, our learning algorithm is, in effect, parameterized by 64d bits.
- Parameterization
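Treating such an H as a finite class of at most 2^(64d) distinct hypotheses, the finite-H bound from the previous slide gives m ≥ (1/ε)(64d ln 2 + ln(1/δ)) = O(d/ε): sample complexity roughly linear in the number of parameters d. A minimal sketch of this step (the ε, δ, and d values are chosen only for illustration):

```python
import math

def sample_bound_bits(d, eps, delta, bits_per_param=64):
    """Treat H as finite with |H| <= 2**(bits_per_param * d) and apply
    m >= (1/eps) * (ln|H| + ln(1/delta)); note ln(2**(64d)) = 64d * ln 2."""
    ln_H = bits_per_param * d * math.log(2.0)
    return math.ceil((ln_H + math.log(1.0 / delta)) / eps)

# Sample complexity grows (approximately) linearly in d:
m_d1 = sample_bound_bits(d=1, eps=0.1, delta=0.01)     # -> 490
m_d10 = sample_bound_bits(d=10, eps=0.1, delta=0.01)   # -> 4483
```

The discretization argument is informal (it depends on the bit-width of the representation, not on H itself), which is exactly why a representation-independent complexity measure like the VC dimension is needed.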