Stat841f09 - Wiki Course Notes

# Theoretically suppose we can estimate some vector



QDA is less robust with fewer data points.

Theoretically: Suppose we can estimate some vector $w^T$ such that

$$y = w^T x$$

where $w$ is a $d$-dimensional column vector and $x \in \mathbb{R}^d$ (a vector in $d$ dimensions).

We also have a non-linear function $g(x) = y = x^T v x + w^T x$ that we cannot estimate.

Using our trick, we create two new vectors, $w^*$ and $x^*$, such that

$$w^{*T} = [w_1, w_2, \ldots, w_d, v_1, v_2, \ldots, v_d]$$

and

$$x^{*T} = [x_1, x_2, \ldots, x_d, x_1^2, x_2^2, \ldots, x_d^2]$$

We can then estimate a new function, $g^*(x^*) = y = w^{*T} x^*$.

Note that we can do this for any $x$ and in any dimension; we could extend a $d \times n$ matrix to a quadratic dimension by appending another $d \times n$ matrix with the original matrix squared, to a cubic dimension with the original matrix cubed, or even with a different function altogether, such as a $\sin(x)$ dimension.

## By Example

Let's use our trick to do a quadratic analysis of the 2_3 data using LDA.

```matlab
>> load 2_3;
>> [U, sample] = princomp(X');
>> sample = sample(:,1:2);
```

We start off the same way, by using PCA to reduce the dimensionality of our data to 2.

```matlab
>> X_star = zeros(400,4);
>> X_star(:,1:2) = sample(:,:);
>> for i=1:400
     for j=1:2
       X_star(i,j+2) = X_star(i,j)^2;
     end
   end
```

This projects our sample into two more dimensions by squaring our initial two-dimensional data set.

```matlab
>> group = ones(400,1);
>> group(201:400) = 2;
>> [class, error, POSTERIOR, logp, coeff] = classify(X_star, X_star, group, 'linear');
>> sum(class == group)

ans =

   375
```

We can now display our results.

```matlab
>> k = coeff(1,2).const;
>> l = coeff(1,2).linear;
>> f = sprintf('0 = %g+%g*x+%g*y+%g*(x)^2+%g*(y)^2', k, l(1), l(2), l(3), l(4));
>> ezplot(f, [min(sample(:,1)), max(sample(:,1)), min(sample(:,2)), max(sample(:,2))]);
```

The plot shows the quadratic decision boundary obtained using LDA in the four-dimensional space on the 2_3.mat data.
Counting the blue and red points that are on the wrong side of the decision boundary, we can confirm that we have correctly classified 375 data points. Not only does LDA give us a better result than it d...
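The same feature-expansion trick can be sketched outside MATLAB as well. The following is an illustrative Python sketch (not the course's code): it assumes scikit-learn's `LinearDiscriminantAnalysis` and uses synthetic concentric-ring data in place of the 2_3.mat digits, so that a quadratic boundary is genuinely needed. Appending the squared coordinates as extra features lets the linear classifier in 4-D cut a quadratic boundary in the original 2-D.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Synthetic stand-in for the 2_3 data: two classes that are NOT
# linearly separable in 2-D (class 1 inside a ring, class 2 outside).
r = np.concatenate([rng.uniform(0.0, 1.0, 200),    # class 1 radii
                    rng.uniform(1.5, 2.5, 200)])   # class 2 radii
theta = rng.uniform(0.0, 2.0 * np.pi, 400)
X = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
y = np.repeat([1, 2], 200)

# Plain LDA in the original 2-D space: a linear boundary fits poorly here.
lda_2d = LinearDiscriminantAnalysis().fit(X, y)

# The trick: append the squared coordinates as two extra features, so a
# linear boundary in (x, y, x^2, y^2) is quadratic in the original (x, y).
X_star = np.hstack([X, X ** 2])
lda_4d = LinearDiscriminantAnalysis().fit(X_star, y)

print("2-D accuracy:", lda_2d.score(X, y))
print("4-D accuracy:", lda_4d.score(X_star, y))
```

For this concentric data the expanded model can separate the classes with a threshold on $x^2 + y^2$, which is linear in the new features, so the 4-D accuracy is far higher than the 2-D one.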

