The number of parameters grows quadratically with respect to the dimension.
7. LDA estimates parameters more efficiently by using more information about the data; samples without class labels can also be used in LDA.
8. As logistic regression relies on fewer assumptions, it seems to be more robust.
9. In practice, logistic regression and LDA often give similar results.
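To make the parameter-count comparison concrete, here is a small Python sketch (the helper names are illustrative, not part of the course code). For two classes in d dimensions, LDA with a shared covariance estimates two d-dimensional means, a symmetric d x d covariance (d(d+1)/2 free entries), and one free class prior, while logistic regression estimates only d coefficients plus an intercept:

```python
def lda_param_count(d, k=2):
    # k class means (d entries each) + shared symmetric covariance
    # (d*(d+1)/2 free entries) + k-1 free class priors
    return k * d + d * (d + 1) // 2 + (k - 1)

def logistic_param_count(d):
    # intercept + one coefficient per input dimension
    return d + 1

for d in (2, 10, 100):
    print(d, lda_param_count(d), logistic_param_count(d))
```

At d = 100 this gives 5251 LDA parameters versus 101 for logistic regression, which is the quadratic-versus-linear growth noted above.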
By example
Now we compare LDA and Logistic regression by an example. Again, we use them on the 2_3 data.
>>plot (sample(1:200,1), sample(1:200,2), '.');
>>hold on;
>>plot (sample(201:400,1), sample(201:400,2), 'r.');
First, we do PCA on the data and plot the data points that represent 2 or 3 in different colors. See the previous example for more details.
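Outside MATLAB, the PCA step can be sketched in pure Python for the 2-D case. This is a minimal sketch on made-up points, not the 2_3 data, and `first_principal_axis` is a hypothetical helper name; it finds the leading eigenvector of the 2x2 covariance in closed form:

```python
import math

def first_principal_axis(points):
    """Leading eigenvector of the 2x2 covariance of 2-D points."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    a = sum((p[0] - mx) ** 2 for p in points) / n
    c = sum((p[1] - my) ** 2 for p in points) / n
    b = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # largest eigenvalue of [[a, b], [b, c]] in closed form
    lam = (a + c) / 2 + math.sqrt(((a - c) / 2) ** 2 + b ** 2)
    if b == 0:
        v = (1.0, 0.0) if a >= c else (0.0, 1.0)
    else:
        v = (b, lam - a)
    norm = math.hypot(v[0], v[1])
    return (v[0] / norm, v[1] / norm)

# points spread mostly along the x-axis, so the axis is close to (1, 0)
pts = [(-2, -0.1), (-1, 0.0), (0, 0.1), (1, 0.0), (2, 0.0)]
print(first_principal_axis(pts))
```

Projecting each point onto this axis gives the first principal component, the analogue of keeping the first column of `sample` in the MATLAB code.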
Group the data points.
>>group = ones(400,1);
>>group(201:400) = 2;

Stat841 - Wiki Course Notes (wikicoursenote.com), 10/09/2013

>>[B, dev, stats] = mnrfit(sample, group);
Now we use mnrfit
(http://www.mathworks.com/access/helpdesk/help/toolbox/stats/index.html?/access/helpdesk/help/toolbox/stats/mnrfit.html) to use logistic regression to classify the data. This function returns B, a matrix of estimates, where each column corresponds to the estimated intercept term and predictor coefficients. In this case, B is a 3x1 vector. This is our β, with B(1) the intercept β0 and B(2), B(3) the coefficients β1, β2. So the posterior probabilities are:

P(Y=1 | X=x) = exp(β0 + β1*x1 + β2*x2) / (1 + exp(β0 + β1*x1 + β2*x2))
P(Y=2 | X=x) = 1 / (1 + exp(β0 + β1*x1 + β2*x2))

The classification rule is: predict class 1 if β0 + β1*x1 + β2*x2 > 0 (equivalently, if P(Y=1 | X=x) > 1/2), and class 2 otherwise.
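The posterior and the threshold rule can be sketched in Python (the β values below are made up for illustration, not fitted from the 2_3 data):

```python
import math

def posterior_class1(beta, x):
    """P(Y = 1 | X = x) for logistic regression; beta[0] is the intercept."""
    z = beta[0] + sum(b * xi for b, xi in zip(beta[1:], x))
    return math.exp(z) / (1 + math.exp(z))

def classify(beta, x):
    """Class 1 if the posterior exceeds 1/2, i.e. the linear score is positive."""
    return 1 if posterior_class1(beta, x) > 0.5 else 2

beta = [0.5, 2.0, -1.0]                    # illustrative values only
print(posterior_class1(beta, (1.0, 1.0)))  # score 0.5 + 2 - 1 = 1.5, posterior ~0.82
print(classify(beta, (1.0, 1.0)))          # 1
print(classify(beta, (-2.0, 0.0)))         # score -3.5, so class 2
```

Note that the boundary P(Y=1 | X=x) = 1/2 is exactly the line where the linear score is zero, which is why the rule is linear in x.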
>>f = sprintf('0 = %g + x*%g + y*%g', B(1), B(2), B(3));
>>ezplot(f, [min(sample(:,1)), max(sample(:,1)), min(sample(:,2)), max(sample(:,2))]);
Plot the decision boundary found by logistic regression. The line shows how the two classes split.
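For readers without the MATLAB Statistics Toolbox, the mnrfit step can be approximated with a small gradient-ascent fit of the log-likelihood in Python. This is a minimal sketch on made-up separable data (labels 0/1 here rather than the 1/2 used by mnrfit), not the 2_3 data:

```python
import math

def fit_logistic(points, labels, lr=0.1, steps=2000):
    """Fit beta = (b0, b1, b2) for P(y=1|x) = 1/(1+exp(-(b0 + b1*x1 + b2*x2)))."""
    b = [0.0, 0.0, 0.0]
    for _ in range(steps):
        g = [0.0, 0.0, 0.0]
        for (x1, x2), y in zip(points, labels):
            z = b[0] + b[1] * x1 + b[2] * x2
            p = 1 / (1 + math.exp(-z))
            err = y - p          # gradient of the log-likelihood
            g[0] += err
            g[1] += err * x1
            g[2] += err * x2
        b = [bi + lr * gi / len(points) for bi, gi in zip(b, g)]
    return b

# two linearly separable clusters: label 1 on the right, 0 on the left
pts = [(-2.0, 0.0), (-1.5, 0.5), (-1.0, -0.5), (1.0, 0.2), (1.5, -0.3), (2.0, 0.4)]
ys  = [0, 0, 0, 1, 1, 1]
beta = fit_logistic(pts, ys)
# the decision boundary is the line b0 + b1*x + b2*y = 0, as drawn by ezplot above
correct = sum((beta[0] + beta[1]*x1 + beta[2]*x2 > 0) == (y == 1)
              for (x1, x2), y in zip(pts, ys))
print(correct)  # 6: all training points fall on the correct side
```

The returned triple plays the same role as mnrfit's B: the first entry is the intercept and the other two are the coefficients defining the linear boundary.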