Assignment #4 Solutions (Chapter 5)
4. Consider a training set that contains 100 positive examples and 400 negative examples. For each of the following candidate rules, R1: A → + (covers 4 positive and 1 negative examples), R2: B → + (covers 30 positive and
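Candidate rules like R1 are typically compared by how accurately they cover the training set. A minimal sketch of two common scoring functions, plain accuracy and the Laplace estimate, is below; the counts for R1 come from the problem statement, and the function names are illustrative, not from the text.

```python
def accuracy(covered_pos, covered_neg):
    """Fraction of covered examples that are positive."""
    return covered_pos / (covered_pos + covered_neg)

def laplace(covered_pos, covered_neg, num_classes=2):
    """Laplace-corrected accuracy; penalizes rules with low coverage."""
    return (covered_pos + 1) / (covered_pos + covered_neg + num_classes)

# R1: A -> + covers 4 positive and 1 negative examples.
r1_pos, r1_neg = 4, 1
print(accuracy(r1_pos, r1_neg))  # 0.8
print(laplace(r1_pos, r1_neg))   # 5/7 ~ 0.714
```

Accuracy alone favors R1 (0.8), but the Laplace estimate (≈0.714) discounts it for covering only five examples, which is the usual motivation for comparing several metrics in this exercise.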

Assignment #3 Solutions (Chapter 4)
7. The following table summarizes a data set with three attributes A, B, C and two class labels + and −. Build a two-level decision tree.
(a) According to the classification error rate, which attribute would be chosen as the
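Selecting a splitting attribute by classification error rate means computing, for each attribute, the weighted error of the child nodes it produces. A minimal sketch is below; the counts are placeholders for illustration only, since the table from the problem is not reproduced here.

```python
def error_rate(pos, neg):
    """Classification error of a node: 1 minus the majority-class fraction."""
    total = pos + neg
    if total == 0:
        return 0.0
    return 1 - max(pos, neg) / total

def split_error(branches):
    """Weighted classification error of a split.
    branches: list of (pos, neg) counts, one per attribute value."""
    total = sum(p + n for p, n in branches)
    return sum((p + n) / total * error_rate(p, n) for p, n in branches)

# Hypothetical split of 10 examples on a binary attribute (T branch, F branch):
print(split_error([(4, 1), (1, 4)]))  # 0.2
```

The attribute yielding the lowest weighted error is chosen at each level, which is how the two-level tree in this exercise is built.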

Assignment #2 Solutions (Chapter 4)
3. Consider the training examples shown in Table 4.8 for a binary classification problem.
(a) What is the entropy of this collection of training examples with respect to the positive class?
Answer: There are four positiv
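The entropy of a two-class collection is H = −p₊ log₂ p₊ − p₋ log₂ p₋, where p₊ and p₋ are the class fractions. A minimal sketch is below; since the answer text above is truncated, the example counts are placeholders rather than the actual counts from Table 4.8.

```python
import math

def entropy(pos, neg):
    """Base-2 entropy of a two-class collection of examples."""
    total = pos + neg
    h = 0.0
    for count in (pos, neg):
        if count > 0:          # 0 * log2(0) is treated as 0
            p = count / total
            h -= p * math.log2(p)
    return h

print(entropy(4, 4))  # balanced classes -> 1.0
print(entropy(4, 0))  # pure collection -> 0.0
```

Entropy is maximal (1 bit) when the classes are evenly balanced and zero when the collection is pure, which is the property the exercise relies on.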

Assignment #1 Solutions
13. Consider the problem of finding the K nearest neighbors of a data object. A programmer designs Algorithm 2.1 for this task.
Algorithm 2.1 Algorithm for finding K nearest neighbors.
for i = 1 to number of data objects do
    Find th
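The brute-force idea behind Algorithm 2.1 can be sketched as follows: compute the distance from the query object to every data object and keep the K closest. Euclidean distance and the sorting-based selection are assumptions here, since the algorithm text above is truncated.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length numeric tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def k_nearest_neighbors(query, data, k):
    """Return indices of the k data objects closest to query (brute force)."""
    order = sorted(range(len(data)), key=lambda i: euclidean(query, data[i]))
    return order[:k]

data = [(0, 0), (1, 1), (5, 5), (2, 2)]
print(k_nearest_neighbors((0, 0), data, 2))  # [0, 1]
```

This scan is O(n) distance computations per query, which is the cost the exercise asks the programmer's design to be analyzed against.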