# SPR_LectureHandouts_Chapter_05 - Pattern Recognition

## Chapter 5: Linear Discriminant Functions (Sections 5.1–5.3)

Pattern Recognition (ECE-8443), Saurabh Prasad, Electrical and Computer Engineering Department, Mississippi State University.

## Outline

- Introduction
- Linear Discriminant Functions and Decision Surfaces
- Generalized Linear Discriminant Functions
## Introduction

- In Chapter 3, the underlying probability densities were known (or given), and the training samples were used to estimate the parameters of those densities (ML, MAP estimation).
- In this chapter, we assume only the proper *form* of the discriminant functions, which is similar in spirit to non-parametric techniques.
- The resulting classifiers may not be optimal, but they are very simple to use.
- They provide us with linear classifiers.

## Linear Discriminant Functions and Decision Surfaces

**Definition.** A linear discriminant function is a linear combination of the components of $\mathbf{x}$:

$$g(\mathbf{x}) = \mathbf{w}^t \mathbf{x} + w_0 \quad (1)$$

where $\mathbf{w}$ is the weight vector and $w_0$ is the bias.

A two-category classifier with a discriminant function of the form (1) uses the following rule:

- Decide $\omega_1$ if $g(\mathbf{x}) > 0$ and $\omega_2$ if $g(\mathbf{x}) < 0$; equivalently, decide $\omega_1$ if $\mathbf{w}^t \mathbf{x} > -w_0$ and $\omega_2$ otherwise.
- If $g(\mathbf{x}) = 0$, then $\mathbf{x}$ lies on the decision boundary and may be assigned to either class.
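The decision rule above can be sketched in a few lines of NumPy. This is a minimal illustration, not the text's own code; the weight vector and bias are arbitrary values chosen for the example.

```python
import numpy as np

def g(x, w, w0):
    """Linear discriminant g(x) = w^t x + w0."""
    return np.dot(w, x) + w0

def classify(x, w, w0):
    """Decide omega_1 if g(x) > 0, omega_2 if g(x) < 0.
    A point with g(x) == 0 lies on the boundary and may go to either
    class; here we arbitrarily assign it to omega_1."""
    return "omega_1" if g(x, w, w0) >= 0 else "omega_2"

# Illustrative 2-D example (weights chosen arbitrarily)
w = np.array([1.0, -2.0])
w0 = 0.5

print(classify(np.array([3.0, 1.0]), w, w0))  # g = 3 - 2 + 0.5 = 1.5 > 0
print(classify(np.array([0.0, 2.0]), w, w0))  # g = 0 - 4 + 0.5 = -3.5 < 0
```

Note that only the sign of $g(\mathbf{x})$ matters for the decision; its magnitude is used later to measure the distance to the decision surface.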
The equation $g(\mathbf{x}) = 0$ defines the decision surface that separates points assigned to category $\omega_1$ from points assigned to category $\omega_2$. When $g(\mathbf{x})$ is linear, this decision surface is a hyperplane. A useful algebraic result gives the distance from any point $\mathbf{x}$ to the hyperplane.
In conclusion, a linear discriminant function divides the feature space by a hyperplane decision surface $H$. The orientation of the surface is determined by the normal vector $\mathbf{w}$, and the location of the surface is determined by the bias $w_0$.

Write $\mathbf{x}$ as its projection $\mathbf{x}_p$ onto $H$ plus a displacement $r$ along the unit normal (since $\mathbf{w}/\|\mathbf{w}\|$ is collinear with $\mathbf{x} - \mathbf{x}_p$ and has unit length):

$$\mathbf{x} = \mathbf{x}_p + r\,\frac{\mathbf{w}}{\|\mathbf{w}\|}$$

Since $g(\mathbf{x}_p) = 0$ and $\mathbf{w}^t \mathbf{w} = \|\mathbf{w}\|^2$,

$$g(\mathbf{x}) = \mathbf{w}^t \mathbf{x} + w_0 = r\,\|\mathbf{w}\|$$

therefore

$$r = \frac{g(\mathbf{x})}{\|\mathbf{w}\|}$$

In particular, the distance from the origin to $H$ is $d(0, H) = \dfrac{w_0}{\|\mathbf{w}\|}$.
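The distance formula $r = g(\mathbf{x})/\|\mathbf{w}\|$ can be verified numerically. This is an illustrative sketch with made-up values for $\mathbf{w}$ and $w_0$; it also checks that the projected point $\mathbf{x}_p = \mathbf{x} - r\,\mathbf{w}/\|\mathbf{w}\|$ lands exactly on the hyperplane.

```python
import numpy as np

def signed_distance(x, w, w0):
    """Signed distance r = g(x) / ||w|| from x to the hyperplane g(x) = 0.
    Positive on the omega_1 side, negative on the omega_2 side."""
    return (np.dot(w, x) + w0) / np.linalg.norm(w)

# Example hyperplane: 3*x1 + 4*x2 - 10 = 0, so ||w|| = 5
w = np.array([3.0, 4.0])
w0 = -10.0

x = np.array([2.0, 6.0])           # g(x) = 6 + 24 - 10 = 20
r = signed_distance(x, w, w0)      # 20 / 5 = 4.0

# Project x onto the hyperplane: x_p = x - r * w/||w||
x_p = x - r * w / np.linalg.norm(w)
g_xp = np.dot(w, x_p) + w0         # should be 0: x_p lies on H

# Distance from origin to H: |w0| / ||w|| = 10 / 5 = 2.0
d_origin = abs(w0) / np.linalg.norm(w)

print(r, g_xp, d_origin)
```

The sign of $r$ carries the same information as the decision rule: $r > 0$ exactly when $g(\mathbf{x}) > 0$, i.e., when we decide $\omega_1$.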