5a-LinearClassification_4p

Linear Models for Classification

Learning a function f : X → Y, with X ⊂ R^n and Y = {C_1, ..., C_K}, assuming linearly separable data.

[Figure: an example of linearly separable data.]

Discriminant functions

A discriminant function assigns each input directly to a class, y : X → {C_1, ..., C_K}.

Linear discriminant functions:
  Two classes: y(x) = w^T x + w_0
  K classes:   y_k(x) = w_k^T x + w_k0

[Figure: geometry of the two-class case — the weight vector w is orthogonal to the decision surface y(x) = 0, with region R_1 where y(x) > 0 and region R_2 where y(x) < 0.]

Multiple classes: one-versus-the-rest

Train K - 1 classifiers, each separating C_k from not-C_k.

[Figure: regions R_1, R_2, R_3, with an ambiguous region "?" where the C_1/not-C_1 and C_2/not-C_2 boundaries overlap.]

Multiple classes: one-versus-one

Train K(K - 1)/2 classifiers, one for each pair C_k vs. C_j.

[Figure: regions R_1, R_2, R_3, with an ambiguous region "?" enclosed by the C_1/C_2, C_1/C_3, and C_2/C_3 boundaries.]

Multiple classes: K-class discriminant

A single K-class discriminant comprising K linear functions

  y_k(x) = w_k^T x + w_k0,

assigning x to C_k if y_k(x) > y_j(x) for all j ≠ k. The decision boundary between C_k and C_j is a hyperplane (of dimension D - 1):

  (w_k - w_j)^T x + (w_k0 - w_j0) = 0

Compact notation

Writing y_k(x) = w_k^T x + w_k0, k = 1, ..., K, with augmented vectors:

  y(x) = W̃^T x̃

where W̃ = (w̃_1 ... w̃_k ... w̃_K), w̃_k = (w_k0, w_k^T)^T, and x̃ = (1, x^T)^T.

Three approaches to learning a linear discriminant:
- Least squares
- Fisher's linear discriminant
- Perceptron

Least squares

Given training data {x_n, t_n}, n = 1, ..., N, find W̃ for the linear discriminant y(x) = W̃^T x̃.

1-of-K coding scheme for the targets t: if x ∈ C_k, then t_k = 1 and t_j = 0 for all j ≠ k; e.g., t_n = (0, ..., 1, ..., 0)^T.

Stack the data row-wise:

  X̃ = (x̃_1^T; ...; x̃_N^T),  T = (t_1^T; ...; t_N^T)

Minimize the sum-of-squares error function

  E(W̃) = (1/2) Tr{(X̃W̃ - T)^T (X̃W̃ - T)}

Solution (with X̃† the pseudo-inverse of X̃):

  W̃ = (X̃^T X̃)^{-1} X̃^T T = X̃† T

  y(x) = W̃^T x̃ = T^T (X̃†)^T x̃

Issues with least squares

Least squares assumes Gaussian conditional distributions, and is not robust to outliers.

[Figure: two scatter plots of a two-class problem illustrating how outliers shift the least-squares decision boundary.]

Fisher's linear discriminant

Consider the two-class case. ...
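The ambiguity of the one-versus-the-rest construction (the "?" region in the slides) can be seen in a tiny sketch. The two classifiers and the test points below are illustrative choices of my own, not from the slides: a point can be claimed by more than one C_k-vs-rest classifier, or by none, so the separate binary decisions do not always yield a unique class.

```python
import numpy as np

# Two hypothetical one-versus-the-rest linear classifiers in R^2:
# f_k(x) = w_k^T x + b_k > 0 means "x is in C_k", otherwise "not C_k".
w1, b1 = np.array([1.0, 0.0]), -0.5   # C_1 vs. not-C_1: claims x_1 > 0.5
w2, b2 = np.array([0.0, 1.0]), -0.5   # C_2 vs. not-C_2: claims x_2 > 0.5

def claims(x):
    """Return the set of classes whose binary classifier claims point x."""
    return {k for k, (w, b) in enumerate([(w1, b1), (w2, b2)], start=1)
            if w @ x + b > 0}

print(claims(np.array([1.0, 0.0])))   # claimed by C_1 only: unambiguous
print(claims(np.array([1.0, 1.0])))   # claimed by both classifiers
print(claims(np.array([0.0, 0.0])))   # claimed by neither classifier
```

The single K-class discriminant on the next slide avoids this by comparing all K scores at once and taking the largest, which always produces exactly one class.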
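To make the least-squares recipe concrete, here is a minimal NumPy sketch (the toy data, variable names, and use of NumPy are my additions, not part of the slides): it builds the augmented matrix X̃, the 1-of-K target matrix T, solves W̃ = X̃† T with the pseudo-inverse, and classifies by the rule y_k(x) > y_j(x) for all j ≠ k.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: K = 3 well-separated Gaussian blobs in R^2, 50 points each.
K = 3
means = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0]])
X = np.vstack([rng.normal(m, 0.5, size=(50, 2)) for m in means])
labels = np.repeat(np.arange(K), 50)

# Augmented inputs: each row is x~_n^T = (1, x_n^T).
X_tilde = np.hstack([np.ones((X.shape[0], 1)), X])

# 1-of-K coding: T[n, k] = 1 iff x_n belongs to class C_k.
T = np.eye(K)[labels]

# Least-squares solution W~ = (X~^T X~)^{-1} X~^T T = X~† T.
W_tilde = np.linalg.pinv(X_tilde) @ T

# Classify: y(x) = W~^T x~; assign x to the class with the largest y_k(x).
scores = X_tilde @ W_tilde          # N x K matrix of y_k(x_n)
predictions = scores.argmax(axis=1)

accuracy = (predictions == labels).mean()
print(f"training accuracy: {accuracy:.2f}")
```

On data this clean the closed-form solution separates the classes well; the outlier sensitivity discussed above only shows up once points far from the class means are added.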
This note was uploaded on 09/21/2009 for the course CS 580, taught by Professor Fdfdf during the Spring '09 term at the University of Toronto.
