
# The Linear Classifier


The second classifier, the linear classifier (or, in its generalized form, the multiple-class linear classifier), is suited only to linearly separable problems such as the one in Fig. 16. In Fig. 16, the three classes C1, C2 and C3 are linearly separable using three linear decision surfaces d1, d2 and d3, each separating one class from the remaining two. For the generalized two-class problem with classes C1 and C2, the linear classifier must satisfy:

d(Xi) = w0 + w1 xi1 + w2 xi2 + … + wd xid > 0   (6)

for every training example Xi from class C1, where i = 1, 2, …, N1; and:

–d(Xj) = –w0 – w1 xj1 – w2 xj2 – … – wd xjd > 0   (7)

for every training example Xj from class C2, where j = 1, 2, …, N2. Here d is the dimensionality of the pattern vector; N1 and N2 are the numbers of training examples from classes C1 and C2 respectively; and w0, w1, w2, …, wd is the set of weights to be calculated during training.

162 Acoustic Emission Testing

Figure 15. Basic principle of the k-nearest neighbors classifier and the respective piecewise linear decision surface for the exclusive or (XOR) problem. Legend: C1, C2, C3, C4 = training classes; x1, x2 = indices.

Figure 16. Basics of the linear classifier and the respective linear decision surfaces d1(X) = 0, d2(X) = 0 and d3(X) = 0. Legend: C1, C2, C3 = training classes; d1, d2, d3 = linear decision surfaces; X = unknown pattern vector; x1, x2 = indices.

Training consists of using known examples from each class (from the training set) to estimate the values of the weights w0, w1, w2, …, wd, where d is the dimensionality of
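The iterative weight estimation described above can be sketched as a perceptron-style training loop. The code below is a minimal illustration, not the book's exact procedure; the function names, the learning rate and the use of labels +1 (class C1) and −1 (class C2) are assumptions chosen so that the two inequalities (6) and (7) collapse into the single condition y · d(X) > 0.

```python
# Sketch of iterative weight estimation for a two-class linear classifier.
# Labels: +1 for class C1, -1 for class C2, so both inequalities become
# y * d(X) > 0. All names here are illustrative, not from the source text.

def train_linear_classifier(examples, labels, epochs=100, lr=0.1):
    """Estimate weights w0, w1, ..., wd so that d(X) > 0 for class C1
    examples and d(X) < 0 for class C2 examples."""
    d = len(examples[0])            # dimensionality of the pattern vector
    w = [0.0] * (d + 1)             # w[0] is the bias weight w0
    for _ in range(epochs):
        violations = 0
        for x, y in zip(examples, labels):
            score = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
            if y * score <= 0:      # inequality violated: adjust weights
                w[0] += lr * y
                for k in range(d):
                    w[k + 1] += lr * y * x[k]
                violations += 1
        if violations == 0:         # all N1 + N2 inequalities satisfied
            break
    return w

def discriminant(w, x):
    """Evaluate d(X) = w0 + w1*x1 + ... + wd*xd."""
    return w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
```

On a linearly separable training set the loop stops once every inequality holds; on a non-separable set it simply runs out of epochs, which mirrors the text's point that this classifier applies only to linearly separable problems.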


the feature space. In other words, weight values are estimated by solving the set of (N1 + N2) inequalities, usually through an iterative process.1-6 Several generalizations to multiple-class problems exist,1-6 in one of which there is one linear discriminant function for each of the C classes. In this case, after training and estimation of the weights for all C discriminant functions, the unknown pattern X is assigned to the class scoring the highest value among the C discriminant functions.1-6 No matter which classification scheme is used, the data composing the training set should vary enough to simulate the expected variability of the real problem. Once the training phase is completed, the key questions are what the error rate is and how accurate that estimate is. One way to estimate the error rate is to evaluate how well the classifier sorts the examples of a testing set containing data of previously known classification. In the worst case, where data are limited, the test set might be identical to the training set. This is the weakest possible test: if the classifier cannot correctly classify even the examples used to train it, then it is not trained. In most cases, a common test procedure is to divide the available data into two subsets, one to train the classifier and the other to test it. The proportion of data used to train and test the classifier also needs to be considered.
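The multiple-class scheme (one discriminant function per class, with assignment to the highest score) and the holdout error-rate estimate can be sketched as follows. This is an illustrative outline under assumed names; the discriminant weights would come from a training step such as the one above, and the 70/30 split fraction is an arbitrary example of the proportion the text says must be chosen.

```python
import random

# Illustrative sketch: multiple-class linear classification by highest
# discriminant score, plus a holdout estimate of the error rate.
# weight_sets[c] = (w0, w1, ..., wd) for class c (assumed trained already).

def classify(weight_sets, x):
    """Assign pattern x to the class whose discriminant scores highest."""
    def d_c(w):
        return w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return max(range(len(weight_sets)), key=lambda c: d_c(weight_sets[c]))

def error_rate(weight_sets, test_data):
    """Fraction of a labeled test set that the classifier misclassifies."""
    wrong = sum(1 for x, label in test_data
                if classify(weight_sets, x) != label)
    return wrong / len(test_data)

def split(labeled_data, train_fraction=0.7, seed=0):
    """Divide the available labeled data into training and testing subsets."""
    shuffled = labeled_data[:]
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]
```

Shuffling before splitting matters in practice: if the available data are ordered by class, an unshuffled split would train and test on different class mixtures and bias the error-rate estimate.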

