CS 6375 Machine Learning, Spring 2009
Homework 2. Total points: 50. Due: 02/10/2009 11:59pm

1. Bayes rule. [10 pts]

Part of exercise 13.11 in the R&N book. Suppose you are given a bag containing n unbiased coins. You are told that n - 1 of these coins are normal, with heads on one side and tails on the other, whereas one coin is a fake, with heads on both sides.

A. Suppose you reach into the bag, pick out a coin uniformly at random, flip it, and get a head. What is the (conditional) probability that the coin you chose is the fake coin?

B. Suppose you continue flipping the coin for a total of k times after picking it and see k heads. What is now the conditional probability that you picked the fake coin?

2. Bayes classifier and Naïve Bayes classifier. [15 pts]

(A). The following data set is used to learn whether a person likes a movie or not.

Major studio?   Genre     Win award?   Like the movie?
no              sci-fi    yes          yes
yes             action    no           yes
no              music     yes          no
yes             action    yes          yes
no              sci-fi    no           no
no              action    no           no
yes             sci-fi    no           no
yes             music     yes          yes
no              music     no           no
no              action    yes          no

Assume you train a naïve Bayes classifier from this data set. How would it classify the following two instances?

(i)  major_studio=yes ^ genre=action ^ win_award=yes
(ii) major_studio=yes ^ genre=action ^ win_award=no

(B). Suppose now you train a Bayes classifier on this data set. How would it classify the two instances above?

Please show your work. You only need to show the steps or calculations that are relevant for the classification of the given instances; you do not need to estimate all the parameters in the model.
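A hedged setup sketch for Problem 1 (an editorial addition, not part of the original handout): with a uniform prior P(fake) = 1/n, P(heads | fake) = 1, and P(heads | normal) = 1/2, Bayes' rule gives

A. P(fake | 1 head) = (1 * 1/n) / (1 * 1/n + (1/2) * (n-1)/n) = 2 / (n + 1).

B. For k independent flips that all come up heads, P(k heads | fake) = 1 and P(k heads | normal) = (1/2)^k, so
   P(fake | k heads) = (1/n) / (1/n + ((n-1)/n) * (1/2)^k) = 2^k / (2^k + n - 1).

For k = 1 this reduces to the expression in part A, which is a useful consistency check.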
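For Problem 2(A), the following Python sketch (an illustrative addition, not the graded solution; the variable and feature names are my own) estimates the naïve Bayes probabilities by counting over the table above and scores the two query instances:

from collections import Counter, defaultdict

# Rows are (major_studio, genre, win_award, like_the_movie), copied from the table above.
data = [
    ("no",  "sci-fi", "yes", "yes"),
    ("yes", "action", "no",  "yes"),
    ("no",  "music",  "yes", "no"),
    ("yes", "action", "yes", "yes"),
    ("no",  "sci-fi", "no",  "no"),
    ("no",  "action", "no",  "no"),
    ("yes", "sci-fi", "no",  "no"),
    ("yes", "music",  "yes", "yes"),
    ("no",  "music",  "no",  "no"),
    ("no",  "action", "yes", "no"),
]

class_counts = Counter(row[-1] for row in data)
# feature_counts[(feature_index, feature_value, class_label)] = count
feature_counts = defaultdict(int)
for *features, label in data:
    for i, value in enumerate(features):
        feature_counts[(i, value, label)] += 1

def nb_score(instance, label):
    """Unnormalized naive Bayes score: P(label) * product over i of P(x_i | label)."""
    score = class_counts[label] / len(data)
    for i, value in enumerate(instance):
        score *= feature_counts[(i, value, label)] / class_counts[label]
    return score

queries = [("yes", "action", "yes"),   # instance (i)
           ("yes", "action", "no")]    # instance (ii)
for q in queries:
    scores = {label: nb_score(q, label) for label in class_counts}
    print(q, scores, "->", max(scores, key=scores.get))

The same per-class and per-feature counts can be read off by hand for the write-up. Part (B) differs in that a (non-naïve) Bayes classifier uses the full joint P(major_studio, genre, win_award | like) estimated from matching rows, rather than the product of per-feature conditionals.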