Chap4-Part1 - Linear Models for Classification: Discriminant Functions

Linear Models for Classification: Discriminant Functions
Sargur N. Srihari
University at Buffalo, State University of New York, USA
Topics
• Linear Discriminant Functions
  – Definition (2-class), Geometry
  – Generalization to K > 2 classes
• Methods to learn parameters
  1. Least Squares Classification
  2. Fisher's Linear Discriminant
  3. Perceptrons
Discriminant Function
• Assigns an input vector x to one of K classes, denoted C_k
• Restrict attention to linear discriminants
  – Decision surfaces are hyperplanes
• First consider K = 2, then extend to K > 2
Geometry of Linear Discriminant Functions
• Two-class linear discriminant function: y(x) = w^T x + w_0, where w is the weight vector and w_0 is the bias
  – Assign x to C_1 if y(x) ≥ 0, else to C_2
• The decision boundary is defined by y(x) = 0
• w determines the orientation of the surface: for any two points x_A and x_B on the surface, w^T (x_A − x_B) = 0, so w is orthogonal to every vector lying within the surface
• w_0 sets the distance of the surface from the origin: for any x on the surface, w^T x / ||w|| = −w_0 / ||w||
• y(x) gives a signed measure of the perpendicular distance r of a point x from the decision surface: r = y(x) / ||w||
• With a dummy input x_0 = 1 and augmented vectors ω = (w_0, w) and x̃ = (x_0, x), we have y(x) = ω^T x̃, a hyperplane passing through the origin of the augmented (D+1)-dimensional space
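A minimal sketch (not from the slides) of evaluating this 2-class discriminant in Python; the weight vector w, bias w0, test point x, and the helper name y are made-up illustrative values, not the lecture's:

```python
# Sketch: 2-class linear discriminant y(x) = w^T x + w_0 (example values only).
import numpy as np

w = np.array([2.0, -1.0])   # weight vector: sets the orientation of the boundary
w0 = 0.5                    # bias: sets the boundary's offset from the origin

def y(x):
    """Evaluate the linear discriminant y(x) = w^T x + w_0."""
    return w @ x + w0

x = np.array([1.0, 3.0])
label = "C1" if y(x) >= 0 else "C2"   # assign x to C1 if y(x) >= 0, else C2
r = y(x) / np.linalg.norm(w)          # signed perpendicular distance from the boundary
print(label, r)
```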
Multiple Classes with 2-class Classifiers
• Build a K-class discriminant by combining several 2-class classifiers
  – One-versus-the-rest: use K − 1 classifiers, each solving the two-class problem of separating one class from all the others
  – One-versus-one: alternatively, use K(K − 1)/2 binary discriminant functions, one for every pair of classes
• Both approaches result in ambiguous regions of the input space (illustrated in the sketch below)
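To make the ambiguity concrete, here is a small illustrative sketch, assuming hypothetical binary scores and a hypothetical helper one_vs_rest_decision standing in for K one-versus-the-rest classifiers:

```python
# Sketch: one-versus-the-rest voting with hypothetical classifier scores.
# scores[k] > 0 means binary classifier k claims the point for class C_k.
def one_vs_rest_decision(scores):
    claimed = [k for k, s in enumerate(scores) if s > 0]
    if len(claimed) == 1:
        return claimed[0]   # exactly one claim: unambiguous assignment
    return None             # zero or several claims: x lies in an ambiguous region

print(one_vs_rest_decision([0.7, -0.2, -1.1]))   # -> 0 (only classifier 0 claims x)
print(one_vs_rest_decision([0.7, 0.3, -1.1]))    # -> None (two classifiers claim x)
print(one_vs_rest_decision([-0.4, -0.2, -1.1]))  # -> None (no classifier claims x)
```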
Multiple Classes with K Discriminants
• Consider a single K-class discriminant of the form y_k(x) = w_k^T x + w_k0
• Assign a point x to class C_k if y_k(x) > y_j(x) for all j ≠ k
• The decision boundary between classes C_k and C_j is given by y_k(x) = y_j(x)
  – This corresponds to a (D − 1)-dimensional hyperplane defined by (w_k − w_j)^T x + (w_k0 − w_j0) = 0
  – Same form as the decision boundary in the 2-class case, w^T x + w_0 = 0
• The decision regions of such a discriminant are always singly connected and convex
  – Proof follows
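A minimal sketch of this decision rule, assuming made-up weight vectors stacked in a matrix W and made-up biases w0 (the classify helper is hypothetical):

```python
# Sketch: K-class linear discriminant, assign x to the class with the largest y_k(x).
import numpy as np

W = np.array([[1.0, 0.0],     # w_1
              [0.0, 1.0],     # w_2
              [-1.0, -1.0]])  # w_3
w0 = np.array([0.0, 0.1, 0.2])

def classify(x):
    """Return the index k maximizing y_k(x) = w_k^T x + w_k0."""
    y = W @ x + w0
    return int(np.argmax(y))

print(classify(np.array([2.0, 0.5])))  # x is assigned to the class with the largest y_k
```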
Proof that decision regions are singly connected and convex:
• Consider two points x_A and x_B, both lying inside decision region R_k
• Any point x̂ on the line connecting x_A and x_B can be expressed as x̂ = λ x_A + (1 − λ) x_B, where 0 ≤ λ ≤ 1
• From the linearity of the discriminant functions y_k(x) = w_k^T x + w_k0, it follows that y_k(x̂) = λ y_k(x_A) + (1 − λ) y_k(x_B)
• Because x_A and x_B both lie inside R_k, we have y_k(x_A) > y_j(x_A) and y_k(x_B) > y_j(x_B) for all j ≠ k, and hence y_k(x̂) > y_j(x̂)
• Thus x̂ also lies inside R_k, and so R_k is singly connected and convex
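A quick numeric check of this argument (illustrative values only; W, w0, and the points x_A, x_B are made up, not from the slides):

```python
# Sketch: verify numerically that convex combinations of two points in R_1 stay in R_1.
import numpy as np

W = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
w0 = np.array([0.0, 0.0, 0.0])
y = lambda x: W @ x + w0          # evaluates y_k(x) = w_k^T x + w_k0 for all k at once

xA, xB = np.array([3.0, 1.0]), np.array([2.0, 0.5])   # both chosen to lie in R_1 (k = 0)
assert np.argmax(y(xA)) == 0 and np.argmax(y(xB)) == 0

for lam in np.linspace(0.0, 1.0, 11):
    x_hat = lam * xA + (1 - lam) * xB
    # by linearity, y(x_hat) = lam*y(xA) + (1-lam)*y(xB), so x_hat also maximizes y_1
    assert np.argmax(y(x_hat)) == 0
print("all convex combinations stay in R_1")
```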