However, there is no overlap between the two classes and they are separated quite well. Thus, FDA performs better than PCA here.
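For a side-by-side look, the sketch below is an addition to these notes, not part of the original example: it projects the same 2_3 data onto its first two principal components using Matlab's built-in pca (princomp in older releases), assuming the 2_3 data file from the earlier PCA example is on the path.

>> load 2_3;                                        % X is 64x400: columns 1:200 are 2's, 201:400 are 3's
>> [coeff, score] = pca(X');                        % rows of X' are the 400 images
>> plot(score(1:200,1), score(1:200,2), 'b.');      % projected 2's
>> hold on;
>> plot(score(201:400,1), score(201:400,2), 'ro');  % projected 3's

Comparing this plot with the FDA projection below makes the difference in class separation easy to see.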
Practical example of 2_3
In this Matlab example we explore FDA using our familiar data set 2_3, which consists of 200 handwritten "2"s and 200 handwritten "3"s.
X is a matrix of size 64×400, and each column represents an 8×8 image of a "2" or a "3". Here X1 gets all the "2"s and X2 gets all the "3"s.
>> X1 = X(:, 1:200);     % all the 2's
>> X2 = X(:, 201:400);   % all the 3's
Next we calculate the within-class covariance and the between-class covariance as before.
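For reference, these are the standard two-class formulas used earlier in the notes, with class means $\mu_1, \mu_2$ and class covariances $\Sigma_1, \Sigma_2$:

$$ S_B = (\mu_1 - \mu_2)(\mu_1 - \mu_2)^T, \qquad S_W = \Sigma_1 + \Sigma_2 $$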
>> m1 = mean(X1, 2);              % mean image of the 2's
>> m2 = mean(X2, 2);              % mean image of the 3's
>> sB = (m1 - m2) * (m1 - m2)';   % between-class covariance
>> sW = cov(X1') + cov(X2');      % within-class covariance
We use the first two eigenvectors to project the data into a two-dimensional space.
>> [W, D] = eig(inv(sW) * sB);              % FDA directions are the eigenvectors of inv(sW)*sB
>> [~, order] = sort(diag(D), 'descend');   % eig does not sort, so order the eigenvalues ourselves
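As a reminder of why this eigen-decomposition appears (a sketch of the underlying optimization, using the same $S_B$ and $S_W$ as above): FDA seeks directions $w$ maximizing

$$ \frac{w^T S_B w}{w^T S_W w}, $$

and the maximizers are the leading eigenvectors of $S_W^{-1} S_B$, which is exactly what the eig call computes.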
Finally we plot the data and visualize the effect of FDA.
>> Y1 = W(:, order(1:2))' * X1;   % project the 2's onto the top two FDA directions
>> Y2 = W(:, order(1:2))' * X2;   % project the 3's
>> plot(Y1(1,:), Y1(2,:), 'b.');
>> hold on;
>> plot(Y2(1,:), Y2(2,:), 'ro');

[Figure: FDA projection of data 2_3, using Matlab (http://www.mathwork.com). The data are mapped onto a line, and the two classes are separated perfectly here.]

An extension of Fisher's discriminant analysis for stochastic processes
A general notion of Fisher's linear discriminant analysis can extend the classical multivariate concept to situations that allow for function-valued random elements. The development uses a bijective mapping that connects a second order process to the reproducing kernel Hilbert space generated by its within-class covariance kernel. This approach provides a seamless transition between Fisher's original development and infinite dimensional settings, and it lends itself well to computation via smoothing and regularization.
Link for algorithm introduction: [http://statgen.ncsu.edu/icsa2007/talks/HyejinShin.pdf]