Unformatted text preview: uced. >n-0 <2; >x-e(,0lnt=) <sq11,eghn; >apa-.; lh<25 >bt<17; ea-.5 >y-lh+eaxromn; <apabt*+nr() >po(~,ph1,ld3 cx05 mi=Oeftig) ltyx c=6 w=, e=., an'vritn'; >aln(lh,bt,cl'le) bieapa ea o=bu'; >lnssln(,y,cl=2; ie(piex ) o ) More details on this topic later on. Figure 26.2: Overfitting Model Selection(Stein's Unbiased Risk Estimate)- November 11th, 2009 Model Selection Model selection (http://en.wikipedia.org/wiki/Model_selection) is a task of selecting a model of optimal complexity for a given data. Learning a radial basis function network from data is a parameter estimation problem. One difficulty with this problem is selecting parameters that show good performance on both training and testing data. In principle, a model is selected to have parameters associated with the best observed performance on training data, although our goal really is to achieve good performance on unseen testing data. Not surprisingly, a model selected on the basis of training data does not necessarily exhibit comparable performance on the testing data. When squared error is used as the performance index, a zero- error model on the training data can always be achieved by using a sufficient number of basis functions. But, training error and testing error do not demonstrate a linear relationship. In particular, a smaller training error does do not necessarily result in a smaller testing error. In practice, one often observes that, up to a certain point, the model error on testing data tends to decrease as the training error decreases. However, if one attempts to decrease the training error too far by increasing model complexity, the testing error often can take a dramatic increase. The basic reason behind this phenomenon is that in the process of minimizing training error, after a certain point, the model begins to over- fit the training set. Over- fitting in this context means fitting the model to training data at the expense of losing generality. In the extreme form, a set of training data points can be modeled exactly with radial basi...