5a-LinearClassification_4p - Linear Models for Classification
Linear Models for Classification

Learning a function f : X -> Y, with X a subset of R^n and Y = {C_1, ..., C_K}, assuming linearly separable data.

Discriminant functions

A discriminant function y : X -> {C_1, ..., C_K} assigns each input directly to a class.

Linear discriminant functions:
Two classes: y(x) = w^T x + w_0, assigning x to C_1 if y(x) > 0 and to C_2 otherwise.
Multiple classes: y_k(x) = w_k^T x + w_k0

[Figure: geometry of the two-class linear discriminant. The decision boundary y(x) = 0 separates region R_1 (y > 0) from region R_2 (y < 0); w is orthogonal to the boundary, and w_0 sets its offset from the origin.]
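As a minimal sketch of the two-class rule above, the following code (function names and example weights are ours, not from the slides) evaluates y(x) = w^T x + w_0 and classifies by its sign:

```python
import numpy as np

def linear_discriminant(w, w0, x):
    """Score y(x) = w^T x + w0; its sign determines the class."""
    return w @ x + w0

def classify(w, w0, x):
    """Return 1 for class C1 (y > 0), 2 for class C2 (y <= 0)."""
    return 1 if linear_discriminant(w, w0, x) > 0 else 2

# Example decision boundary x1 + x2 = 1
w = np.array([1.0, 1.0])
w0 = -1.0
print(classify(w, w0, np.array([2.0, 2.0])))  # y = 3 > 0, so class 1
print(classify(w, w0, np.array([0.0, 0.0])))  # y = -1 <= 0, so class 2
```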
Multiple classes

One-versus-the-rest classifier: K - 1 classifiers, each separating C_k from not-C_k. This leaves ambiguous regions claimed by more than one classifier (or by none).

One-versus-one classifier: K(K - 1)/2 classifiers, one for each pair C_k vs. C_j. This too leaves ambiguous regions.

K-class discriminant: comprises K linear functions

    y_k(x) = w_k^T x + w_k0

assigning x to C_k if y_k(x) > y_j(x) for all j != k. The decision boundary between C_k and C_j is a (D - 1)-dimensional hyperplane:

    (w_k - w_j)^T x + (w_k0 - w_j0) = 0

Augmented notation: with y_k(x) = w_k^T x + w_k0, k = 1, ..., K, write

    y(x) = W~^T x~

where W~ = (w~_1 ... w~_k ... w~_K), w~_k = (w_k0, w_k^T)^T, and x~ = (1, x^T)^T.
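The K-class discriminant in augmented form can be sketched as follows; the matrix W_tilde below is a made-up three-class example in 2-D, not taken from the slides:

```python
import numpy as np

def predict(W_tilde, x):
    """Assign x to the class k maximizing y_k(x) = w~_k^T x~."""
    x_tilde = np.concatenate(([1.0], x))   # x~ = (1, x^T)^T
    scores = W_tilde.T @ x_tilde           # all y_k(x) at once
    return int(np.argmax(scores))          # winning class index

# Columns are w~_k = (w_k0, w_k^T)^T for three classes
W_tilde = np.array([[0.0,  0.0, -1.0],
                    [1.0, -1.0,  0.0],
                    [0.0,  0.0,  1.0]])

print(predict(W_tilde, np.array([2.0, 0.0])))   # scores (2, -2, -1): class 0
print(predict(W_tilde, np.array([-2.0, 0.0])))  # scores (-2, 2, -1): class 1
```

Because the same linear score set decides every class jointly, this rule has no ambiguous regions, unlike one-versus-the-rest and one-versus-one.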
Three approaches to learning a linear discriminant: least squares, Fisher's linear discriminant, and the perceptron.

Least squares

Given training data {x_n, t_n}, n = 1, ..., N, find W~ for the linear discriminant y(x) = W~^T x~. Targets use a 1-of-K coding scheme: if x is in C_k, then t_k = 1 and t_j = 0 for all j != k, e.g. t_n = (0, ..., 1, ..., 0)^T. Stack the data row-wise:

    X~ = (x~_1^T; ...; x~_N^T),    T = (t_1^T; ...; t_N^T)

Minimize the sum-of-squares error function

    E(W~) = (1/2) Tr{ (X~ W~ - T)^T (X~ W~ - T) }

Solution, via the pseudo-inverse X~+ = (X~^T X~)^(-1) X~^T:

    W~ = (X~^T X~)^(-1) X~^T T = X~+ T,    so    y(x) = W~^T x~ = T^T (X~+)^T x~

Issues with least squares: it implicitly assumes Gaussian conditional distributions, and it is not robust to outliers.

[Figure: two scatter plots showing how a few outliers pull the least-squares decision boundary away from a good separator.]
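The pseudo-inverse solution above can be sketched directly in code; the two-cluster toy data below is ours, purely for illustration:

```python
import numpy as np

# Toy data: two well-separated 2-D clusters, 20 points each
rng = np.random.default_rng(0)
X1 = rng.normal([0.0, 0.0], 0.3, size=(20, 2))   # class C1
X2 = rng.normal([2.0, 2.0], 0.3, size=(20, 2))   # class C2
X = np.vstack([X1, X2])

# 1-of-K target matrix T (here K = 2)
T = np.zeros((40, 2))
T[:20, 0] = 1
T[20:, 1] = 1

X_tilde = np.hstack([np.ones((40, 1)), X])   # prepend bias column: x~ = (1, x)
W_tilde = np.linalg.pinv(X_tilde) @ T        # W~ = X~+ T

def predict(x):
    """Assign x to the class with the largest least-squares score."""
    y = W_tilde.T @ np.concatenate(([1.0], x))
    return int(np.argmax(y))

print(predict(np.array([0.1, -0.1])))  # near the C1 cluster: class 0
print(predict(np.array([2.1, 1.9])))   # near the C2 cluster: class 1
```

Adding a few far-away outliers to one class and refitting would visibly tilt the boundary, which is the robustness issue the slides point out.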
Fisher's linear discriminant

Consider the two-class case. Determine y = w^T x and classify x as C_1 if y >= -w_0, and as C_2 otherwise. This corresponds to projecting x onto a line in the direction of w.
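The preview cuts off before w is derived. As a hedged sketch, the standard Fisher criterion chooses w proportional to S_W^(-1)(m2 - m1), where m1, m2 are the class means and S_W the within-class scatter matrix; the data below is made up for illustration:

```python
import numpy as np

# Two classes with the same elongated shape, offset from each other
rng = np.random.default_rng(1)
X1 = rng.normal([0.0, 0.0], [1.0, 0.2], size=(50, 2))   # class C1
X2 = rng.normal([2.0, 1.0], [1.0, 0.2], size=(50, 2))   # class C2

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
# Within-class scatter S_W = sum of per-class scatter matrices
S_W = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)
w = np.linalg.solve(S_W, m2 - m1)            # Fisher direction (up to scale)

# Project onto w and threshold midway between the projected class means
y1, y2 = X1 @ w, X2 @ w
threshold = 0.5 * (y1.mean() + y2.mean())
acc = (np.sum(y1 < threshold) + np.sum(y2 >= threshold)) / 100
print(f"training accuracy: {acc:.2f}")
```

Projecting onto m2 - m1 alone would ignore the within-class covariance; dividing by S_W is what lets Fisher exploit the low-variance direction here.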