

Linear Discriminant Functions

A linear discriminant function has the form

    g(x) = w^T x + w_0

where x = (x_1, x_2, ..., x_d)^T is the sample vector, w = (w_1, w_2, ..., w_d)^T is the weight vector, and w_0 is the threshold weight.

Two-category case. Let g(x) = g_1(x) - g_2(x) and decide

    g(x) > 0  =>  x ∈ ω_1
    g(x) < 0  =>  x ∈ ω_2
    g(x) = 0  =>  x may be assigned to either class.

The decision surface is the hyperplane H defined by g(x) = 0.

Geometry of the decision surface. For any x_1, x_2 ∈ H,

    w^T x_1 + w_0 = w^T x_2 + w_0  =>  w^T (x_1 - x_2) = 0,

so w is normal to H. Any x can be decomposed as

    x = x_p + r (w / ||w||),

where x_p is the projection of x onto H and r is the signed distance from x to H. Substituting,

    g(x) = w^T (x_p + r w/||w||) + w_0 = w^T x_p + w_0 + r ||w|| = r ||w||,

since g(x_p) = 0, and therefore

    r = g(x) / ||w||.

Generalized linear discriminant functions (see Figure 4.2). A quadratic discriminant in one variable,

    g(x) = (x - a)(x - b) = c_0 + c_1 x + c_2 x^2,

becomes linear in transformed variables: g(x) = a^T y with

    y = (y_1, y_2, y_3)^T = (1, x, x^2)^T,
    a = (a_1, a_2, a_3)^T = (c_0, c_1, c_2)^T.

Augmented vectors. In the same way, g(x) = w_0 + w^T x = a^T y with

    y = (1, x^T)^T,   a = (w_0, w^T)^T,

where y is called the augmented sample vector and a the augmented weight vector.

The Curse of Dimensionality

(Figures: examples with radius r = 1 as the dimension d grows.)

Fisher Linear Discriminant

Given N d-dimensional samples x_1, ..., x_N, with N_1 samples from class ω_1 and N_2 from class ω_2, N_1 + N_2 = N, project every sample onto a line in direction w:

    y_n = w^T x_n,   n = 1, 2, ..., N.

In the original space, for i = 1, 2:

    sample mean:            m_i = (1/N_i) Σ_{x ∈ ω_i} x
    scatter matrix:         S_i = Σ_{x ∈ ω_i} (x - m_i)(x - m_i)^T
    within-class scatter:   S_w = S_1 + S_2
    between-class scatter:  S_b = (m_1 - m_2)(m_1 - m_2)^T

After projection, with Y_i the set of projected samples of class ω_i:

    projected mean:     m̃_i = (1/N_i) Σ_{y ∈ Y_i} y
    projected scatter:  S̃_i^2 = Σ_{y ∈ Y_i} (y - m̃_i)^2,   S̃_w^2 = S̃_1^2 + S̃_2^2

The Fisher criterion is

    J_F(w) = (m̃_1 - m̃_2)^2 / (S̃_1^2 + S̃_2^2).

Express J_F in terms of w:

    m̃_i = (1/N_i) Σ_{y ∈ Y_i} y = (1/N_i) Σ_{x ∈ ω_i} w^T x = w^T m_i

    (m̃_1 - m̃_2)^2 = (w^T m_1 - w^T m_2)^2
                    = w^T (m_1 - m_2)(m_1 - m_2)^T w = w^T S_b w

    S̃_i^2 = Σ_{y ∈ Y_i} (y - m̃_i)^2 = Σ_{x ∈ ω_i} (w^T x - w^T m_i)^2
           = w^T [ Σ_{x ∈ ω_i} (x - m_i)(x - m_i)^T ] w = w^T S_i w

Hence

    J_F(w) = (w^T S_b w) / (w^T S_w w).

Since J_F is invariant to the scale of w, maximize w^T S_b w subject to w^T S_w w = c ≠ 0, with Lagrangian

    L = w^T S_b w - λ (w^T S_w w - c).

Setting ∂L/∂w = 0 gives

    S_b w - λ S_w w = 0  =>  S_b w = λ S_w w  =>  S_w^{-1} S_b w = λ w,

a generalized eigenvalue problem. Because

    S_b w = (m_1 - m_2)(m_1 - m_2)^T w = (m_1 - m_2) R,

where R = (m_1 - m_2)^T w is a scalar,

    λ w = S_w^{-1} (S_b w) = S_w^{-1} (m_1 - m_2) R,

and since only the direction of w matters, the solution is

    w* = S_w^{-1} (m_1 - m_2).

Choosing the threshold y_0: classify a sample by computing y = w^T x and deciding

    y > y_0  =>  x ∈ ω_1,    y < y_0  =>  x ∈ ω_2.

Common choices:

    y_0^(1) = (m̃_1 + m̃_2) / 2
    y_0^(2) = (N_2 m̃_1 + N_1 m̃_2) / (N_1 + N_2)
    y_0^(3) = (m̃_1 + m̃_2)/2 + ln(P(ω_1)/P(ω_2)) / (N_1 + N_2 - 2)

Remark: when the two classes are Gaussian with a common covariance matrix, the Bayes decision rule is also linear, with the same direction S_w^{-1}(m_1 - m_2); in that case Fisher's discriminant agrees with the Bayes classifier.

Extensions of the Fisher discriminant: the idea carries over to nonlinear settings (the Kernel Fisher discriminant) and to local variants, e.g.:

    Hastie T. and Tibshirani R. Discriminant adaptive nearest neighbor classification. IEEE Trans. on PAMI, 1996, 18(6): 409-415.

Related work appears regularly at NIPS and ICML.

Perceptron and Linearly Separable Samples

Seek a weight vector a such that, for the augmented samples y_i,

    a^T y_i > 0  for  y_i ∈ ω_1,
    a^T y_i < 0  for  y_i ∈ ω_2.

Normalization: replace each sample by

    y'_n =  y_i  if y_i ∈ ω_1,
    y'_n = -y_i  if y_i ∈ ω_2,

so both conditions become a^T y'_n > 0. The problem is then: find a such that

    a^T y_n > 0,   n = 1, 2, ..., N.

Perceptron criterion function. Let Y_k denote the set of samples misclassified by a(k). Define

    J_P(a) = Σ_{y ∈ Y_k} (-a^T y),

which is nonnegative and reaches zero exactly when no sample is misclassified. Its gradient is

    ∂J_P/∂a = Σ_{y ∈ Y_k} (-y),

so gradient descent gives the update

    a(k+1) = a(k) - η_k ∇J_P = a(k) + η_k Σ_{y ∈ Y_k} y.

Batch perceptron algorithm:

    Step 1: initialize a(0), the step sizes η_k, and set k = 0.
    Step 2: compute ∇J_P(a) = Σ_{y ∈ Y_k} (-y).
    Step 3: update a(k+1) = a(k) + η_k Σ_{y ∈ Y_k} y.
    Step 4: if the weight vector no longer changes, stop; otherwise go to Step 2.

Single Sample Correction Algorithm

Present the samples cyclically: y^1, y^2, ..., y^N, y^1, y^2, ... Whenever the current sample y^k is misclassified (a(k)^T y^k ≤ 0), update

    a(k+1) = a(k) + y^k,

after which

    a(k+1)^T y^k = a(k)^T y^k + (y^k)^T y^k,

i.e. the response to y^k increases by ||y^k||^2.

Multicategory Case

A c-class problem can be reduced to two-class problems in two standard ways: c - 1 discriminants, each separating one class from all the rest, or c(c-1)/2 discriminants, one for each pair of classes.

...
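A minimal numerical sketch of the hyperplane geometry and the augmented-vector form, using hypothetical values (w = (3, 4), w_0 = -5, and the test point x = (2, 1) are chosen for illustration only):

```python
import numpy as np

# Hypothetical hyperplane g(x) = w^T x + w0 = 0 in 2-D.
w = np.array([3.0, 4.0])
w0 = -5.0

def g(x):
    """Linear discriminant g(x) = w^T x + w0."""
    return w @ x + w0

x = np.array([2.0, 1.0])

# Signed distance from x to the hyperplane: r = g(x) / ||w||.
r = g(x) / np.linalg.norm(w)

# Projection onto the hyperplane: x_p = x - r * w/||w||,
# so that x = x_p + r * w/||w|| as in the decomposition above.
x_p = x - r * w / np.linalg.norm(w)

# The same discriminant via augmented vectors:
# y = (1, x^T)^T, a = (w0, w^T)^T, giving g(x) = a^T y.
a = np.concatenate(([w0], w))
y = np.concatenate(([1.0], x))

print(r)             # g(x) = 6 + 4 - 5 = 5 and ||w|| = 5, so r = 1.0
print(g(x_p))        # ~0: x_p lies on the hyperplane
print(a @ y == g(x)) # True: augmented form reproduces g(x)
```

The augmented form is convenient because the bias w_0 is absorbed into a single dot product, which is what the perceptron procedures later in the notes operate on.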
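The Fisher derivation can be sketched end to end on synthetic data. The class means, spreads, sample counts, and random seed below are all illustrative assumptions, not values from the notes; the threshold uses choice (1), y_0 = (m̃_1 + m̃_2)/2:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-class Gaussian data (values chosen for illustration).
X1 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(30, 2))  # class omega_1
X2 = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(40, 2))  # class omega_2

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

# Within-class scatter S_w = S_1 + S_2 with S_i = sum (x - m_i)(x - m_i)^T.
S1 = (X1 - m1).T @ (X1 - m1)
S2 = (X2 - m2).T @ (X2 - m2)
Sw = S1 + S2

# Optimal direction: w* = S_w^{-1} (m_1 - m_2).
w = np.linalg.solve(Sw, m1 - m2)

# Projected means and threshold choice (1): y0 = (m~_1 + m~_2) / 2.
mt1, mt2 = w @ m1, w @ m2
y0 = 0.5 * (mt1 + mt2)

# Decision rule: y = w^T x; y > y0 -> omega_1, y < y0 -> omega_2.
acc1 = ((X1 @ w) > y0).mean()  # fraction of omega_1 correctly classified
acc2 = ((X2 @ w) < y0).mean()  # fraction of omega_2 correctly classified
print(acc1, acc2)              # close to 1.0 for well-separated classes
```

Note that `np.linalg.solve(Sw, m1 - m2)` is used instead of forming S_w^{-1} explicitly; solving the linear system is the standard, numerically preferable way to compute S_w^{-1}(m_1 - m_2).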
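The single sample correction algorithm can be sketched as follows. The toy samples are hypothetical; the normalization step (negating the ω_2 samples so every constraint reads a^T y > 0) matches the notes, and the step size is fixed at 1:

```python
import numpy as np

def perceptron_single_sample(Y):
    """Single sample correction on normalized augmented samples
    (rows of Y are y'_n, with class omega_2 samples already negated).
    Cycle through the samples; whenever a^T y <= 0, set a <- a + y.
    Terminates when a full pass makes no correction (separable data)."""
    a = np.zeros(Y.shape[1])
    changed = True
    while changed:
        changed = False
        for y in Y:
            if a @ y <= 0:          # y is misclassified by the current a
                a = a + y           # a(k+1) = a(k) + y^k
                changed = True
    return a

# Hypothetical separable toy data, augmented as y = (1, x^T)^T.
omega1 = np.array([[1.0,  2.0,  2.0],
                   [1.0,  3.0,  1.0]])
omega2 = np.array([[1.0, -1.0, -1.0],
                   [1.0, -2.0,  0.0]])
Y = np.vstack([omega1, -omega2])    # normalization: negate omega_2 samples

a = perceptron_single_sample(Y)
print((Y @ a > 0).all())            # True: a^T y'_n > 0 for every sample
```

Each correction raises the response on the offending sample by ||y^k||^2, exactly as in the update identity above, and the loop is guaranteed to terminate when the (normalized) samples are linearly separable.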

This note was uploaded on 06/02/2010 for the course ELECTRONIC PC2010S taught by Professor Zhangchangshui during the Spring '10 term at Tsinghua University.
