If a data point lies exactly at the center of a basis function, then the function value would be one, i.e. the probability is 1. If the point is far from the center, then the probability (function value) will be close to zero, that is, it's less likely. Therefore, we can treat \phi_j(x) as the probability of a particular feature j given the data x.
When we have those features, the output \hat{y}(x) = \sum_{j=1}^{M} w_j \phi_j(x) is a linear combination of the features. Hence each weight w_j, which can be read as the probability that the class label appears given feature \phi_j, tells us how likely that class is. Therefore, the weight shows the probability of class membership given the feature. Hence, we have found a probabilistic point of view from which to look at the RBF network!
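The "one at the center, near zero far away" behavior can be checked directly. This is a minimal sketch assuming a Gaussian basis function \phi_j(x) = \exp(-\|x - \mu_j\|^2 / (2\sigma^2)); the center `mu_j` and the width `sigma` below are illustrative choices, not values from the notes.

```python
import numpy as np

def gaussian_rbf(x, mu, sigma=1.0):
    # Gaussian basis: equals exactly 1 at the center mu, decays toward 0 far away
    return np.exp(-np.sum((x - mu) ** 2) / (2.0 * sigma ** 2))

mu_j = np.array([0.0, 0.0])                        # hypothetical center
print(gaussian_rbf(mu_j, mu_j))                    # at the center: 1.0
print(gaussian_rbf(np.array([10.0, 10.0]), mu_j)) # far away: vanishingly small
```

Since the value always lies in (0, 1], it is at least numerically eligible to be read as a probability.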
Note: There are some inconsistencies with this probabilistic point of view. There are no restrictions that force the weights w_j to be between 0 and 1, so if least squares is used to solve for them, w_j cannot be interpreted as a probability.
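This inconsistency is easy to exhibit: least-squares weights are unconstrained, so they can land outside [0, 1]. The tiny feature matrix and targets below are hypothetical, chosen only to show the effect.

```python
import numpy as np

# Hypothetical 3-by-2 feature matrix (rows: data points, cols: RBF features)
Phi = np.array([[1.0, 0.2],
                [0.3, 1.0],
                [0.9, 0.8]])
y = np.array([1.0, -2.0, 0.5])

# Ordinary least squares: nothing constrains w to [0, 1]
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(w)  # at least one entry is negative, so it cannot be a probability
```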
Aside: One way to produce a feature space is LDA.
Suppose we have n data points x_1, \dots, x_n. Each data point has d features, and these n data points form the matrix X (d by n).
wikicoursenote.com/w/index.php?title=Stat841&printable=yes — Stat841 Wiki Course Notes, 10/09/2013
Also, we have the feature space \Phi. If we want to solve a regression problem for the input data, we do not perform least squares on the X matrix; we perform least squares on the feature space, i.e. on the \Phi matrix. The dimensionality of \Phi is M by n. We can also add a constant basis function \phi_0 = 1, which is not a function of x.
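The procedure above — build \Phi from the data, append the constant basis \phi_0 = 1, then run least squares on \Phi rather than on X — can be sketched as follows. The data set, the M = 3 centers, and the width sigma are all hypothetical; note the notes write \Phi as M by n, while the code uses the transposed n-by-(M+1) convention that numpy's solvers expect.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D data set: n = 50 inputs, targets from a noisy sine
x = rng.uniform(-3, 3, size=50)
y = np.sin(x) + 0.1 * rng.standard_normal(50)

centers = np.array([-2.0, 0.0, 2.0])  # M = 3 hypothetical centers
sigma = 1.0

# One Gaussian-RBF column per center...
Phi = np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * sigma ** 2))
# ...plus a constant column phi_0 = 1 that is not a function of x
Phi = np.hstack([np.ones((len(x), 1)), Phi])

# Least squares on the feature space, not on the raw inputs
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(Phi.shape)  # (50, 4): n data points, M + 1 features each
```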
Now, we still have n data points, but we define these n data points in terms of a new set of features. So, originally, we defined our data points by d features, but now we define them by M features. And what are those M features telling us?
Let us look at the first column of the \Phi matrix, which corresponds to the data point x_1. The first entry is \phi_1(x_1), the second is \phi_2(x_1), and so on, until the last entry, \phi_M(x_1). Each of these \phi_j checks the similarity of the data point to its center \mu_j. Hence, the new set of features actually represents M centers in our data set, and for each data point, its new features check how similar this point is to the first center, the second center, and so on.
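The interpretation above can be made concrete: a single data point's M new features are just its similarities to the M centers, with the nearest center producing the largest feature. The point, the centers, and the width are again hypothetical.

```python
import numpy as np

# One data point and M = 3 hypothetical centers (sigma = 1):
# the point's new features are its similarities to each center
x_i = 1.9
centers = np.array([-2.0, 0.0, 2.0])
features = np.exp(-(x_i - centers) ** 2 / 2.0)
print(features)  # the largest entry corresponds to the nearest center, 2.0
```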
- Winter '13