% Reconstructed from a garbled extract; variable names (sample, training,
% group) are assumed from the surrounding examples.
[class, err, POSTERIOR, logp, coeff] = classify(sample, training, group, 'linear');
% Boundary between classes 1 and 2: 0 = k + l(1)*x + l(2)*y
k = coeff(1,2).const;
l = coeff(1,2).linear;
f = sprintf('0 = %g + x*%g + y*%g', k, l(1), l(2));
ezplot(f, [min(sample(:,1)), max(sample(:,1)), min(sample(:,2)), max(sample(:,2))]);

Plot the decision boundary found by LDA. See the previous example for more information about LDA in MATLAB.

From this figure, we can see that the results of Logistic Regression and LDA are very similar.

wikicoursenote.com/w/index.php?title=Stat841&printable=yes — Stat841 Wiki Course Notes, 10/09/2013

2009.10.21
Multi-Class Logistic Regression
Our earlier goal with logistic regression was to model the posteriors for a 2-class classification problem with a linear function bounded by the interval [0,1]. In that case our model was

\[ P(G = 1 \mid X = x) = \frac{\exp(\beta_0 + \beta^T x)}{1 + \exp(\beta_0 + \beta^T x)}. \]

We can extend this idea to the more general case with K classes. This model is specified with K - 1 terms, where the Kth class in the denominator can be chosen arbitrarily. The posteriors for each class are given by

\[ P(G = k \mid X = x) = \frac{\exp(\beta_{k0} + \beta_k^T x)}{1 + \sum_{l=1}^{K-1} \exp(\beta_{l0} + \beta_l^T x)}, \quad k = 1, \dots, K-1, \]

\[ P(G = K \mid X = x) = \frac{1}{1 + \sum_{l=1}^{K-1} \exp(\beta_{l0} + \beta_l^T x)}. \]

Seeing these equations as a weighted least squares problem makes them easier to derive.
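The posterior formulas above can be sketched numerically. The following is a minimal illustration, not part of the original notes: the coefficient values are hypothetical, and the model is parameterized exactly as described, with K - 1 coefficient vectors and the Kth class serving as the reference in the denominator.

```python
import numpy as np

def multiclass_logistic_posteriors(x, betas):
    """Posteriors for a K-class logistic model with K-1 coefficient rows.

    x     : feature vector of length p
    betas : (K-1, p+1) array; row k holds [beta_k0, beta_k^T] for class k
    """
    x1 = np.concatenate(([1.0], x))   # prepend 1 to absorb the intercept
    scores = betas @ x1               # beta_k0 + beta_k^T x, k = 1..K-1
    denom = 1.0 + np.exp(scores).sum()
    # First K-1 posteriors use exp(score); the Kth class gets 1 in the numerator.
    return np.concatenate((np.exp(scores), [1.0])) / denom

# Hypothetical coefficients for a 3-class problem with 2 features.
betas = np.array([[ 0.5, 1.0, -2.0],
                  [-0.3, 0.5,  0.5]])
post = multiclass_logistic_posteriors(np.array([1.0, 2.0]), betas)
print(post, post.sum())  # K posteriors that sum to 1
```

Note that the sum over all K classes is 1 by construction, which is the property discussed next.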
Note that we still retain the property that the sum of the posteriors is 1. In general, however, the posteriors are no longer complements of each other, as is true in the 2-class problem, where we could express \( P(G = 2 \mid X = x) = 1 - P(G = 1 \mid X = x) \). Fitting a logistic model for the K > 2 class problem isn't as 'nice' as in the 2-class problem, since we don't have the same simplification.

Multi-class kernel logistic regression
Logistic regression (LR) and kernel logistic regression (KLR) have already proven their value in the statistical and machine learning communities. As opposed to the empirical risk minimization approach employed by Support Vector Machines (SVMs), LR and KLR yield probabilistic outcomes based on a maximum likelihood argument. It seems that this framework provides a natural extension to multiclass classification tasks, which must be contrasted with the commonly used...
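To make the maximum likelihood view of KLR concrete, here is a minimal binary-case sketch (the notes discuss the multiclass extension, but the binary case shows the mechanics). Everything here is illustrative: the RBF kernel, the gradient-descent fit, the penalty weight, and the toy data are all assumptions, not the method from the original text.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gram matrix of the RBF kernel k(a, b) = exp(-gamma * ||a - b||^2)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_klr(X, y, lam=0.1, gamma=1.0, lr=0.1, iters=500):
    """Binary kernel logistic regression, y in {0, 1}.

    Minimizes the penalized negative log-likelihood by gradient descent;
    the fitted function is f(x) = sum_i alpha_i k(x_i, x).
    """
    K = rbf_kernel(X, X, gamma)
    alpha = np.zeros(len(y))
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(K @ alpha)))   # P(y = 1 | x_i)
        grad = K @ (p - y) + lam * (K @ alpha)   # gradient w.r.t. alpha
        alpha -= lr * grad / len(y)
    return alpha

# Tiny hypothetical dataset: two well-separated clusters in 1-D.
X = np.array([[0.0], [0.2], [2.0], [2.2]])
y = np.array([0, 0, 1, 1])
alpha = fit_klr(X, y)
p = 1.0 / (1.0 + np.exp(-(rbf_kernel(X, X) @ alpha)))
print(np.round(p, 2))  # fitted probabilities, higher for the second cluster
```

Unlike an SVM decision value, the output here is a probability estimate obtained directly from the likelihood, which is the distinction the paragraph above draws.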