variables. When using princomp on the 2_3 data in assignment 1, note that we take the transpose of X:
>> load 2_3;
>> [coef, score] = princomp(X');
Second, princomp centers X by subtracting off column means.
Third, when PCA is computed through the SVD of the centered data, princomp uses the right singular vectors V as the coefficients for the principal components, rather than U. The following is an example that performs PCA using princomp and SVD respectively to get the same results.
>> load 2_3
>> mn = mean(X, 2);
>> X1 = X - repmat(mn, 1, size(X, 2));  % center the data
>> [u d v] = svd(X1');                  % SVD of the centered, transposed data
>> y1 = X1' * v;                        % scores from the SVD
>> [U score] = princomp(X');            % princomp on the same data
Then we can see that y1 = score and v = U.
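The same princomp/SVD equivalence can be sketched in NumPy (illustrative only: numpy.linalg.svd stands in for MATLAB's svd, and a random matrix stands in for the 2_3 data). The columns of V from the SVD of the centered data match the eigenvectors of the sample covariance, up to a sign per column:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))     # 100 observations, 5 variables (rows = observations)

# Center by subtracting column means, as princomp does
Xc = X - X.mean(axis=0)

# PCA via SVD: Xc = U S V^T; columns of V are the principal-component coefficients
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
coef_svd = Vt.T
score_svd = Xc @ coef_svd         # projections of the data (the "scores")

# PCA via eigendecomposition of the sample covariance
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]  # eigh returns ascending order; sort by decreasing variance
coef_eig = eigvecs[:, order]

# The two coefficient sets agree up to a sign flip per column
for j in range(5):
    a, b = coef_svd[:, j], coef_eig[:, j]
    assert np.allclose(a, b) or np.allclose(a, -b)
```

The sign ambiguity is expected: a principal component direction is only defined up to its sign, so two implementations can legitimately disagree by a factor of -1 on any column.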
Useful resources: LDA and QDA in Matlab (http://www.mathworks.com/products/statistics/demos.html?file=/products/demos/shipping/stats/classdemo.html), (http://www.mathworks.com/matlabcentral/fileexchange/189), (http://seed.ucsd.edu/~cse190/media07/MatlabClassificationDemo.pdf)

Trick: Using LDA to do QDA - October 7, 2009
There is a trick that allows us to use the linear discriminant analysis (LDA) algorithm to generate as its output a quadratic function that can be used to classify data. This trick
is similar to, but more primitive than, the Kernel trick (http://en.wikipedia.org/wiki/Kernel_trick) that will be discussed later in the course.
Essentially, the trick involves adding one or more new features (i.e. new dimensions) that just contain our original data projected to that dimension. We then do LDA on our new higher-dimensional data. The answer provided by LDA can then be collapsed onto a lower dimension, giving us a quadratic answer.
(wikicoursenote.com/w/index.php?title=Stat841&printable=yes, 10/09/2013, Stat841 - Wiki Course Notes)
Motivation
Why would we want to use LDA over QDA? In situations where we have fewer data points, LDA turns out to be more robust.
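A rough parameter count makes the robustness claim concrete. The sketch below (my own helper, not from the notes) assumes d dimensions and K classes, and counts each symmetric covariance matrix as d(d+1)/2 free numbers:

```python
def lda_qda_params(d, K):
    """Number of parameters LDA and QDA must estimate.

    Both models estimate K class means (K*d numbers) and K-1 free
    class priors.  LDA adds one shared covariance matrix (d*(d+1)/2
    free numbers, by symmetry); QDA adds one covariance per class.
    """
    shared = K * d + (K - 1)
    cov = d * (d + 1) // 2
    return shared + cov, shared + K * cov

# e.g. 10 dimensions, 3 classes:
lda_n, qda_n = lda_qda_params(d=10, K=3)  # 87 vs. 197 parameters
```

With few data points, the extra per-class covariances in QDA are estimated from even fewer samples each, which is where LDA's advantage comes from.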
If we look back at the equations for LDA and QDA, we see that in LDA we must estimate the class priors pi_k, the class means mu_k, and a single shared covariance matrix Sigma. In QDA we must estimate all of those, plus a separate covariance matrix Sigma_k for each class; the extra estimation makes QDA less reliable when data are scarce.
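Returning to the trick itself, here is a minimal NumPy sketch (the 1-D toy data and all names are made up for illustration): we augment a one-dimensional dataset with x**2 as a second feature and run plain two-class LDA on the augmented data. The resulting rule is linear in (x, x**2), and therefore quadratic in the original x:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy 1-D problem that no single threshold can separate:
# class 0 sits near 0, class 1 sits near -3 and +3.
x0 = rng.normal(0.0, 0.5, 200)
x1 = np.concatenate([rng.normal(-3.0, 0.5, 100), rng.normal(3.0, 0.5, 100)])

def augment(x):
    # The trick: add x**2 as a new feature (a new dimension).
    return np.column_stack([x, x**2])

A0, A1 = augment(x0), augment(x1)

# Ordinary two-class LDA on the augmented data (pooled covariance, equal priors).
m0, m1 = A0.mean(axis=0), A1.mean(axis=0)
Sw = np.cov(A0, rowvar=False) + np.cov(A1, rowvar=False)
w = np.linalg.solve(Sw, m1 - m0)   # linear weights on (x, x**2)
c = w @ (m0 + m1) / 2              # threshold at the midpoint of the class means

def predict(x):
    # Linear in the augmented space, quadratic in the original x.
    return (augment(x) @ w > c).astype(int)

acc = np.mean(np.concatenate([predict(x0) == 0, predict(x1) == 1]))
# the quadratic rule separates the two classes almost perfectly
```

No single threshold on x alone can separate these classes, but the LDA boundary in the augmented space collapses to a rule of the form "classify as class 1 when x**2 exceeds a constant", i.e. a quadratic decision function in x.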