# SPR_LectureHandouts_Chapter_03_Part4_LDA
## Pattern Recognition (ECE-8443), Chapter 3, Part 4: Linear Discriminant Analysis

Saurabh Prasad, Electrical and Computer Engineering Department, Mississippi State University

### Principal Component Analysis

Previously, we saw that we can make a d'-dimensional linear projection (d' < d) onto a hyperplane, optimal in the minimum-squared-error (MSE) sense:

$$x = m + \sum_{i=1}^{d'} a_i e_i$$

The criterion function

$$J_{d'}(\mathbf{e}) = \sum_{k=1}^{n} \left\| \left( m + \sum_{i=1}^{d'} a_{ki} e_i \right) - x_k \right\|^2$$

is minimized when the vectors e1, e2, ..., ed' are the d' eigenvectors of the scatter matrix having the largest eigenvalues.

### PCA Algorithm

Mapping a d-dimensional space onto a d'-dimensional subspace:

1. Estimate the scatter matrix, S, using the available training data:
   $$S = \sum_{k=1}^{n} (x_k - m)(x_k - m)^t$$
2. Perform an eigenvalue decomposition of the scatter matrix:
   $$S = U \Lambda U^T$$
3. Select the d' eigenvectors (columns of U) corresponding to the d' largest eigenvalues in Λ, and store them in a new matrix U'.
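The three steps above can be sketched in NumPy. This is a minimal illustration, not code from the handout; function and variable names are my own, and samples are centered by the mean m to match the expansion x = m + Σ aᵢeᵢ:

```python
import numpy as np

def pca_fit(X, d_prime):
    """Fit PCA by eigendecomposition of the scatter matrix.

    X: (n, d) array of n training samples; d_prime: target dimension d' < d.
    Returns the sample mean m and U_prime, a (d, d') matrix whose columns are
    the eigenvectors of S with the d' largest eigenvalues.
    """
    m = X.mean(axis=0)
    Xc = X - m
    S = Xc.T @ Xc                      # scatter matrix: sum_k (x_k - m)(x_k - m)^t
    evals, U = np.linalg.eigh(S)       # eigh since S is symmetric; eigenvalues ascending
    order = np.argsort(evals)[::-1]    # re-sort eigenvalues in descending order
    U_prime = U[:, order[:d_prime]]    # keep the d' leading eigenvectors
    return m, U_prime

def pca_project(X, m, U_prime):
    """Project samples onto the d'-dimensional PCA subspace."""
    return (X - m) @ U_prime
```

A quick usage check: after fitting with d' = 2, projected data has shape (n, 2), and the variance along the first principal component is at least that along the second, since eigenvalues were sorted in descending order.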
The resulting PCA projection becomes:

$$y = U'^T x \quad \forall x, \qquad \text{where } x \in \mathbb{R}^{d} \text{ and } y \in \mathbb{R}^{d'}$$

(Because the d' eigenvectors are stored as the columns of U', projecting a sample uses the transpose of U'.)

*(Figure: two-class data, Class I and Class II, distributed in the original feature space, and the same data projected onto the PCA domain, plotted against Principal Component 1 and Principal Component 2. A second example figure shows the same comparison for a different data distribution.)*

### Linear Discriminant Analysis

*(Figure: two-class data in the original feature space, and the projected distribution P(y) along an LDA direction y; the projection maximizes the separation between the projected classes while minimizing the spread within each class.)*

PCA will attempt to find a direction along the direction of largest variance. *(Figure: classes A and B with two candidate projection directions, w1 and w2.)* There is a lot of overlap among the projections of class A and class B onto direction w1, but no overlap among their projections onto direction w2. Linear discriminant analysis finds the optimum surface on which to project the features, so as to obtain maximum class separation.
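The excerpt ends before deriving the LDA solution, but the two-class case it illustrates (picking w2 over w1) can be sketched with the standard closed-form Fisher discriminant direction, w ∝ S_W⁻¹(m₁ − m₂). Note that this formula is an assumption on my part and does not appear in this excerpt; names are my own:

```python
import numpy as np

def fisher_lda_direction(X1, X2):
    """Two-class Fisher LDA direction (standard closed form, not derived in
    this excerpt): w proportional to S_W^{-1} (m1 - m2), which maximizes the
    separation of the projected class means relative to the within-class spread.

    X1: (n1, d) samples from class 1; X2: (n2, d) samples from class 2.
    Returns a unit-length projection direction w of shape (d,).
    """
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    # Within-class scatter: sum of the per-class scatter matrices
    S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
    w = np.linalg.solve(S_W, m1 - m2)
    return w / np.linalg.norm(w)
```

Projecting each class onto the returned w (`X1 @ w`, `X2 @ w`) reproduces the behavior in the figure: the two projected clusters separate cleanly, as on direction w2, rather than overlapping, as on w1.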