CS 6375 Machine Learning, Spring 2009
Homework 2. Total points: 50. Due: 02/10/2009, 11:59pm.

1. Bayes rule. [10 pts] Part of exercise 13.11 in the R&N book.

Suppose you are given a bag containing n unbiased coins. You are told that n-1 of these coins are normal, with heads on one side and tails on the other, whereas one coin is a fake, with heads on both sides.

A. Suppose you reach into the bag, pick out a coin uniformly at random, flip it, and get a head. What is the (conditional) probability that the coin you chose is the fake coin?

B. Suppose you continue flipping the coin for a total of k times after picking it and see k heads. What is now the conditional probability that you picked the fake coin? (A numerical sketch of this computation appears after question 2 below.)

2. Bayes classifier and Naïve Bayes classifier. [15 pts]

(A). The following data set is used to learn whether a person likes a movie or not.

Major studio?   Genre     Win award?   Like the movie?
no              sci-fi    yes          yes
yes             action    no           yes
no              music     yes          no
yes             action    yes          yes
no              sci-fi    no           no
no              action    no           no
yes             sci-fi    no           no
yes             music     yes          yes
no              music     no           no
no              action    yes          no

Assume you train a naïve Bayes classifier from this data set. How would it classify the following two instances?

(i) major_studio=yes ^ genre=action ^ win_award=yes
(ii) major_studio=yes ^ genre=action ^ win_award=no

(B). Suppose now you train a Bayes classifier on this data set. How would it classify the two instances above? Please show your work. You only need to show the steps or calculations that are relevant for the classification of the given instances; you don't need to estimate all the parameters in the model.
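The following is a minimal sketch of the Bayes-rule computation behind question 1; it is illustrative rather than the graded solution, and the function name prob_fake_given_heads and the sample values n=10, k=5 are assumptions made here for the example.

```python
from fractions import Fraction

def prob_fake_given_heads(n, k):
    """P(fake | k heads in k flips) for a bag of n coins,
    exactly one of which is two-headed."""
    prior_fake = Fraction(1, n)            # coin chosen uniformly at random
    prior_normal = Fraction(n - 1, n)
    like_fake = Fraction(1)                # the fake always lands heads
    like_normal = Fraction(1, 2) ** k      # k independent fair flips
    numerator = prior_fake * like_fake
    evidence = numerator + prior_normal * like_normal
    return numerator / evidence            # Bayes' rule

# Part A is the k = 1 case, e.g. with n = 10 coins:
print(prob_fake_given_heads(10, 1))   # 2/11  = 2/(n+1)
print(prob_fake_given_heads(10, 5))   # 32/41 = 2^k / (2^k + n - 1)
```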
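Similarly, here is a minimal counting sketch for question 2(A), assuming plain maximum-likelihood (unsmoothed) estimates; the rows are transcribed from the table above, and naive_bayes_score is a name invented for this sketch.

```python
# Rows: (major_studio, genre, win_award, like_the_movie)
data = [
    ("no",  "sci-fi", "yes", "yes"),
    ("yes", "action", "no",  "yes"),
    ("no",  "music",  "yes", "no"),
    ("yes", "action", "yes", "yes"),
    ("no",  "sci-fi", "no",  "no"),
    ("no",  "action", "no",  "no"),
    ("yes", "sci-fi", "no",  "no"),
    ("yes", "music",  "yes", "yes"),
    ("no",  "music",  "no",  "no"),
    ("no",  "action", "yes", "no"),
]

def naive_bayes_score(instance, label):
    """Unnormalized P(label) * prod_j P(attr_j = v_j | label), MLE counts."""
    rows = [r for r in data if r[3] == label]
    score = len(rows) / len(data)                      # class prior
    for j, value in enumerate(instance):
        matches = sum(1 for r in rows if r[j] == value)
        score *= matches / len(rows)                   # per-attribute conditional
    return score

for instance in [("yes", "action", "yes"), ("yes", "action", "no")]:
    scores = {c: naive_bayes_score(instance, c) for c in ("yes", "no")}
    print(instance, "->", max(scores, key=scores.get), scores)
```

The full Bayes classifier of part (B) would instead condition on the joint attribute configuration, so only the training rows matching all of the given attribute values at once are relevant there.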
(C). There are M attributes in a data set, all binary features. You use a Naïve Bayes classifier to learn the target concept (binary classification). Exactly how many distinct probability terms must be estimated from the training data to learn a Naïve Bayes classifier for this problem? The Naïve Bayes classifier makes conditional independence assumptions to reduce the complexity of estimating P(target | attr_1, attr_2, ..., attr_M) from the training data. If no such assumptions are made, how many distinct probability terms must be estimated from the training data? (A counting sketch appears after question 4 below.)

3. Maximum Likelihood Estimation. [15 pts]

A. Suppose X is a binary random variable that takes value 0 with probability p and value 1 with probability 1-p. Let X_1, ..., X_n be IID samples of X.

(i) Compute an MLE estimate of p (denote it by p̂).
(ii) What is the expectation of this estimate? If it is equal to p, the estimate is called unbiased; otherwise it is biased. Is the MLE estimate unbiased?

B. Let X_1, ..., X_n ~ Uniform(0, θ), with density f(x|θ) = 1/θ for 0 ≤ x ≤ θ. Use MLE to find θ. (A derivation sketch also appears after question 4.)

4. Paper reading. [10 pts]

Find one paper that uses a naïve Bayes classifier for an application and summarize it. Please include the paper's bibliographic information in your write-up.
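A counting sketch for 2(C), under one common convention (counts in the literature differ by a small constant depending on whether complement probabilities are listed separately):

```latex
% Naive Bayes with binary target Y and binary attributes X_1, ..., X_M:
% one class prior plus one conditional per attribute per class value
% (the complementary probabilities follow by normalization).
\[
\underbrace{1}_{P(Y=1)}
  + \underbrace{2M}_{P(X_j = 1 \mid Y = y),\ y \in \{0,1\}}
  = 2M + 1 .
\]
% Without the conditional-independence assumption, P(Y = 1 \mid X_1, ..., X_M)
% needs one free parameter per joint attribute configuration, i.e. 2^M terms.
```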
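And for question 3, a sketch of the standard derivations (the usual textbook route, stated in the problem's own notation):

```latex
% 3A. X = 0 with probability p; write n_0 for the number of zeros among X_1, ..., X_n.
\[
\ell(p) = n_0 \log p + (n - n_0)\log(1-p), \qquad
\ell'(p) = \frac{n_0}{p} - \frac{n - n_0}{1-p} = 0
\;\Longrightarrow\;
\hat{p} = \frac{n_0}{n} = 1 - \frac{1}{n}\sum_{i=1}^{n} X_i .
\]
% Since E[X_1] = 1 - p, we get E[\hat{p}] = 1 - (1 - p) = p: the MLE is unbiased.

% 3B. The likelihood \prod_i f(X_i \mid \theta) equals \theta^{-n} when
% \max_i X_i \le \theta and 0 otherwise, so it is maximized by the smallest
% feasible value:
\[
\hat{\theta}_{\mathrm{MLE}} = \max_{1 \le i \le n} X_i .
\]
```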
CS 6375 Homework 2
Chenxi Zeng, UTD ID: 11124236

1.