Chapter 15 Machine Learning


In [66]: metrics.confusion_matrix(y_test, y_test_pred)
Out[66]:
array([[13,  0,  0],
       [ 0, 12,  1],
       [ 0,  1, 18]])

With the decision tree classifier, the resulting confusion matrix is somewhat different, corresponding to one additional misclassification in the testing dataset. Other popular classifiers that are available in scikit-learn include the nearest-neighbor classifier KNeighborsClassifier from the sklearn.neighbors module, the support vector classifier SVC from the sklearn.svm module, and the random forest classifier RandomForestClassifier from the sklearn.ensemble module. Since they all have the same usage pattern, we can programmatically apply a series of classifiers to the same problem and compare their performance (on this particular problem), for example, as a function of the training and testing sample sizes. To this end, we create a NumPy array with training size ratios, ranging from 10% to 90%.
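A minimal sketch of this comparison, assuming the Iris dataset, default classifier settings, and an accuracy score as the performance measure; the variable names (train_size_vec, accuracies) are illustrative and not necessarily those of the original listing:

import numpy as np
from sklearn import datasets, metrics, model_selection
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

X, y = datasets.load_iris(return_X_y=True)

# Training-set fractions from 10% to 90%, as described in the text.
train_size_vec = np.linspace(0.1, 0.9, 9)

# All four classifiers share the fit/predict usage pattern.
classifiers = [DecisionTreeClassifier, KNeighborsClassifier, SVC,
               RandomForestClassifier]

accuracies = np.zeros((len(train_size_vec), len(classifiers)))

for i, train_size in enumerate(train_size_vec):
    # Re-split the data for each training-set fraction.
    X_train, X_test, y_train, y_test = model_selection.train_test_split(
        X, y, train_size=train_size, random_state=0)
    for j, cls in enumerate(classifiers):
        y_test_pred = cls().fit(X_train, y_train).predict(X_test)
        accuracies[i, j] = metrics.accuracy_score(y_test, y_test_pred)

Each row of accuracies then holds the testing-set accuracy of the four classifiers for one training-set fraction, which can be plotted against train_size_vec to compare how the methods behave as the amount of training data grows.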
