
Approximation Error and Approximation Theory

Federico Girosi
Center for Basic Research in the Social Sciences, Harvard University
and
Center for Biological and Computational Learning, MIT
[email protected]
1 Plan of the class

- Learning and generalization error
- Approximation problem and rates of convergence
- N-widths
- "Dimension independent" convergence rates
2 Note

These slides cover more material than will be presented in class.
3 References

The background material on generalization error (first 8 slides) is explained at length in:

1. P. Niyogi and F. Girosi. On the relationship between generalization error, hypothesis complexity, and sample complexity for radial basis functions. Neural Computation, 8:819–842, 1996.

2. P. Niyogi and F. Girosi. Generalization bounds for function approximation from scattered noisy data. Advances in Computational Mathematics, 10:51–80, 1999.

[1] has a longer explanation and introduction, while [2] is more mathematical and also contains a very simple probabilistic proof of a class of "dimension independent" bounds, like the ones discussed at the end of this class.

As far as I know, it was A. Barron who first clearly spelled out the decomposition of the generalization error into two parts. He uses a framework different from the one we use here, and summarizes it nicely in:

3. A.R. Barron. Approximation and estimation bounds for artificial neural networks. Machine Learning, 14:115–133, 1994.

The paper is quite technical, but it is important reading if you plan to do research in this field.

The material on n-widths comes from:

4. A. Pinkus. N-widths in Approximation Theory. Springer-Verlag, New York, 1980.

Although the book is very technical, the first 8 pages contain an excellent introduction to the subject. Another great thing about this book is that you do not need to understand every single proof to appreciate the beauty and significance of the results; it is a mine of useful information.

5. H.N. Mhaskar. Neural networks for optimal approximation of smooth and analytic functions. Neural Computation, 8:164–177, 1996.
6. A.R. Barron. Universal approximation bounds for superpositions of a sigmoidal function. IEEE Transactions on Information Theory, 39(3):930–945, 1993.

7. F. Girosi and G. Anzellotti. Rates of convergence of approximation by translates. A.I. Memo 1288, Artificial Intelligence Laboratory, Massachusetts Institute of Technology, 1992.

For a curious way to prove dimension independent bounds using VC theory, see:

8. F. Girosi. Approximation error bounds that use VC-bounds. In Proc. International Conference on Artificial Neural Networks, F. Fogelman-Soulié and P. Gallinari, editors, Vol. 1, 295–302, Paris, France, October 1995.
5 Notations review

$$I[f] = \int_{X \times Y} V(f(\mathbf{x}), y)\, p(\mathbf{x}, y)\, d\mathbf{x}\, dy$$

$$I_{\mathrm{emp}}[f] = \frac{1}{l} \sum_{i=1}^{l} V(f(\mathbf{x}_i), y_i)$$

$$f_0 = \arg\min_{f} I[f], \qquad f_0 \in \mathcal{T}$$

$$f_H = \arg\min_{f \in H} I[f]$$

$$\hat{f}_{H,l} = \arg\min_{f \in H} I_{\mathrm{emp}}[f]$$
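To make these definitions concrete, here is a minimal numerical sketch (not from the slides; the square loss, the toy data, and the grid-search hypothesis space are all assumptions chosen for illustration). It evaluates the empirical risk $I_{\mathrm{emp}}[f]$ and selects the empirical minimizer $\hat{f}_{H,l}$ from a small hypothesis space $H$:

```python
import numpy as np

# Square loss V(f(x), y) = (f(x) - y)^2.
# (An assumption: the slides keep the loss V generic.)
def V(pred, y):
    return (pred - y) ** 2

def I_emp(f, X, Y):
    """Empirical risk: (1/l) * sum_{i=1}^{l} V(f(x_i), y_i)."""
    return np.mean([V(f(x), y) for x, y in zip(X, Y)])

# Toy data: l = 50 noisy samples of an unknown target function.
rng = np.random.default_rng(0)
l = 50
X = rng.uniform(-np.pi, np.pi, size=l)
Y = np.sin(X) + 0.1 * rng.normal(size=l)

# A small hypothesis space H: linear functions f(x) = w*x with w on a
# coarse grid (a stand-in for a richer class such as an RBF network).
H = [lambda x, w=w: w * x for w in np.linspace(-2.0, 2.0, 81)]

# Empirical risk minimizer: f_hat = argmin_{f in H} I_emp[f].
f_hat = min(H, key=lambda f: I_emp(f, X, Y))
print("minimal empirical risk I_emp[f_hat] =", I_emp(f_hat, X, Y))
```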
6 More notation review

$I[f_0]$ = how well we could possibly do

$I[f_H]$ = how well we can do in space $H$

$I[\hat{f}_{H,l}]$ = how well we actually do in space $H$ with $l$ examples
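With these quantities in hand, the two-part decomposition of the generalization error credited to Barron in reference [3] can be written as follows (a standard formulation, stated here for concreteness):

$$I[\hat{f}_{H,l}] - I[f_0] = \underbrace{\big(I[f_H] - I[f_0]\big)}_{\text{approximation error}} \;+\; \underbrace{\big(I[\hat{f}_{H,l}] - I[f_H]\big)}_{\text{estimation error}}$$

The first term depends only on the choice of the hypothesis space $H$ and is the subject of approximation theory; the second depends on how well $l$ samples let us identify the best function in $H$.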