# Regularization Networks

9.520 Class 17, 2003. Tomaso Poggio.

## Plan

- Radial Basis Functions and their extensions
- Additive Models
- Regularization Networks
- Dual Kernels
- Conclusions
## About this class

We describe a family of regularization techniques based on radial kernels $K$, called RBFs. We introduce RBF extensions such as Hyper Basis Functions and characterize their relation with other techniques, including MLPs and splines.

## Radial Basis Functions

Radial Basis Functions, like MLPs, have the universal approximation property.

**Theorem:** Let $K$ be a Radial Basis Function and $I_n$ the $n$-dimensional cube $[0,1]^n$. Then finite sums of the form

$$f(x) = \sum_{i=1}^{N} c_i K(x - x_i)$$

are dense in $C[I_n]$. In other words, given a function $h \in C[I_n]$ and $\epsilon > 0$, there is a sum $f(x)$ of the above form for which

$$|f(x) - h(x)| < \epsilon \quad \text{for all } x \in I_n.$$
Notice that RBFs correspond to RKHSs defined on an infinite domain. Notice also that RKHSs do not in general have the same approximation property: an RKHS generated by a $K$ with an infinite, countable number of strictly positive eigenvalues is dense in $L^2$ but not necessarily in $C(X)$, though it can be embedded in $C(X)$.

## Density of an RKHS on a bounded domain (the non-RBF case)

We first ask under which conditions an RKHS is dense in $L^2(X, \nu)$:

1. When $L_K$ is strictly positive, the RKHS is infinite-dimensional and dense in $L^2(X, \nu)$.
2. In the degenerate case, the RKHS is finite-dimensional and not dense in $L^2(X, \nu)$.
3. In the conditionally strictly positive case, the RKHS is not dense in $L^2(X, \nu)$, but when completed with a finite number of polynomials of appropriate degree it can be made dense in $L^2(X, \nu)$.
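A small numerical illustration of the gap between cases 1 and 2 (the kernel choices here are mine): on distinct points, the Gram matrix of a strictly positive kernel such as the Gaussian has full rank, while a degenerate kernel such as the linear kernel $K(x, y) = xy$ on the line spans only a one-dimensional space:

```python
import numpy as np

x = np.linspace(0.1, 1.0, 10)  # 10 distinct points

# Strictly positive case: Gaussian kernel (bandwidth chosen narrow so the
# Gram matrix is well conditioned) -> full-rank Gram matrix.
K_gauss = np.exp(-(x[:, None] - x[None, :])**2 / (2 * 0.01**2))

# Degenerate case: linear kernel K(x, y) = x * y -> rank-1 Gram matrix.
K_lin = x[:, None] * x[None, :]

print(np.linalg.matrix_rank(K_gauss))  # 10: the functions K(., x_i) are independent
print(np.linalg.matrix_rank(K_lin))    # 1: the RKHS is one-dimensional
```

The rank of the Gram matrix is the dimension of the span of the functions $K(\cdot, x_i)$, which is why the degenerate kernel cannot generate a dense subspace.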
## Density of an RKHS on a bounded domain (cont.)

Density of an RKHS, defined on a compact domain $X$, in $C(X)$ (in the sup norm) is a trickier issue that has been answered very recently by Zhou (in preparation). It is, however, guaranteed for radial kernels $K$, with $K$ continuous and integrable, if density in $L^2(X, \nu)$ holds (with $X$ the infinite domain).

The following are facts about radial kernels, unrelated to RKHS properties:

- $\operatorname{span}\{K(x - y) : y \in \mathbb{R}^n\}$ is dense in $L^2(\mathbb{R}^n)$ iff the Fourier transform of $K$ does not vanish on a set of positive Lebesgue measure (N. Wiener).
- $\operatorname{span}\{K(x - y) : y \in \mathbb{R}^n\}$ is dense in $C(\mathbb{R}^n)$ (topology of uniform convergence) if $K \in C(\mathbb{R}^n)$ and $K \in L^1(\mathbb{R}^n)$.

## Some good properties of RBF

- Well motivated in the framework of regularization theory;
- The solution is unique and equivalent to solving a linear system;
- The degree of smoothness is tunable (with $\lambda$);
- Universal approximation property;
- Large body of applied-math literature on the subject;
- Interpretation in terms of neural networks (?!);
- Biologically plausible;
- Simple interpretation in terms of a smooth look-up table;
- Similar to other non-parametric techniques, such as nearest neighbor and kernel regression (see end of this class).
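A sketch of the second and third points, assuming a Gaussian kernel, synthetic noisy data, and the common regularized form $(K + \lambda I)c = y$ (exact conventions vary, e.g. by a factor of $N$ on $\lambda$): the coefficients come from one linear solve, and increasing $\lambda$ trades data fit for smoothness:

```python
import numpy as np

def rbf_fit(x_train, y_train, x_eval, lam, sigma=0.1):
    """Solve (K + lam*I) c = y, then evaluate f(x) = sum_i c_i K(|x - x_i|).
    Gaussian kernel and parameter names are illustrative choices."""
    K = np.exp(-(x_train[:, None] - x_train[None, :])**2 / (2 * sigma**2))
    c = np.linalg.solve(K + lam * np.eye(len(x_train)), y_train)
    K_eval = np.exp(-(x_eval[:, None] - x_train[None, :])**2 / (2 * sigma**2))
    return K_eval @ c

rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 30)
y_train = np.sin(2 * np.pi * x_train) + 0.1 * rng.standard_normal(30)
x_eval = np.linspace(0.0, 1.0, 200)

# Small lambda: fit the noisy data closely (wiggly solution);
# large lambda: much smoother solution, shrunk toward zero.
f_small = rbf_fit(x_train, y_train, x_eval, lam=1e-6)
f_large = rbf_fit(x_train, y_train, x_eval, lam=10.0)
print(np.sum(np.diff(f_small, 2)**2), np.sum(np.diff(f_large, 2)**2))
```

The printed quantities are discrete roughness measures (summed squared second differences); the heavily regularized solution is much smoother.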
## Some not-so-good properties of RBF

- Computationally expensive: the solution requires solving an $N \times N$ linear system, which costs $O(N^3)$ in the number $N$ of data points;
- That linear system for the coefficients is often badly ill-conditioned.
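The ill-conditioning is easy to see numerically (the Gaussian kernel and the spacings here are my own choices): as the centers become dense relative to the kernel width, the condition number of $K$ blows up, and a small $\lambda$ on the diagonal tames it:

```python
import numpy as np

def cond_of_gram(n, sigma=0.2, lam=0.0):
    """Condition number of the (regularized) Gaussian Gram matrix
    on n evenly spaced centers in [0, 1]."""
    x = np.linspace(0.0, 1.0, n)
    K = np.exp(-(x[:, None] - x[None, :])**2 / (2 * sigma**2))
    return np.linalg.cond(K + lam * np.eye(n))

# Denser centers -> nearly parallel basis functions -> exploding condition
# number; adding lam = 1e-3 to the diagonal bounds it.
for n in (5, 10, 20):
    print(n, cond_of_gram(n), cond_of_gram(n, lam=1e-3))
```

Regularization thus helps numerically as well as statistically, at the price of no longer interpolating the data exactly.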
