CS 6375 Machine Learning, Spring 2009
Homework 2. Total points: 50
Due: 02/10/2009 11:59pm
Note: The following answers may not have all the details. Let me know if you have any
questions.
1. Bayes' rule. [10 pts]
Part of exercise 13.11 in R&N book.
Suppose you are given a bag containing n unbiased coins. You are told that n-1 of these coins are normal, with heads on one side and tails on the other, whereas one coin is a fake, with heads on both sides.
a. Suppose you reach into the bag, pick out a coin uniformly at random, flip it, and get a head. What is the (conditional) probability that the coin you chose is the fake coin?
b. Suppose you continue flipping the coin for a total of k times after picking it and see k heads. What is the conditional probability that you picked the fake coin?
Solution:
a) By Bayes' rule,
\[
P(fake \mid head) = \frac{P(fake, head)}{P(head)}
= \frac{p(head \mid fake)\, p(fake)}{p(head \mid fake)\, p(fake) + p(head \mid normal)\, p(normal)}
= \frac{1 \times \frac{1}{n}}{1 \times \frac{1}{n} + \frac{1}{2} \times \frac{n-1}{n}}
= \frac{2}{n+1}.
\]
b) Similarly,
\[
P(fake \mid k\ heads) = \frac{P(fake, k\ heads)}{P(k\ heads)}
= \frac{p(k\ heads \mid fake)\, p(fake)}{p(k\ heads \mid fake)\, p(fake) + p(k\ heads \mid normal)\, p(normal)}
= \frac{1 \times \frac{1}{n}}{1 \times \frac{1}{n} + \left(\frac{1}{2}\right)^k \times \frac{n-1}{n}}
= \frac{2^k}{2^k + n - 1}.
\]
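The derivation above can be checked numerically. The following is a minimal sketch (the function name `posterior_fake` is my own) that computes the posterior by the same Bayes-rule expansion and confirms it matches the closed form 2^k/(2^k + n - 1):

```python
from fractions import Fraction

def posterior_fake(n, k):
    """Posterior probability that the picked coin is the fake,
    given k heads in k flips, computed via Bayes' rule."""
    prior_fake = Fraction(1, n)           # one fake coin out of n
    prior_normal = Fraction(n - 1, n)     # the remaining n-1 are normal
    like_fake = Fraction(1)               # fake coin always lands heads
    like_normal = Fraction(1, 2) ** k     # normal coin: (1/2)^k for k heads
    num = like_fake * prior_fake
    return num / (num + like_normal * prior_normal)

# Agrees with the closed form 2^k / (2^k + n - 1) for a range of n, k:
for n in range(2, 10):
    for k in range(1, 6):
        assert posterior_fake(n, k) == Fraction(2**k, 2**k + n - 1)
```

With k = 1 this reduces to part (a): for example, posterior_fake(3, 1) gives 1/2, matching 2/(n+1) at n = 3. Exact `Fraction` arithmetic is used so the equality check is not affected by floating-point rounding.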
2. Bayes classifier and Naïve Bayes classifier. [15 pts]
(A). The following data set is used to learn whether a person likes a movie or not.
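The homework's data table does not survive in this copy, so as an illustration only, here is a minimal Naive Bayes sketch over a made-up stand-in dataset (the data values, feature names, and the helper `train_nb` are all my own, not from the assignment). It estimates P(label) and P(feature value | label) by maximum-likelihood counts and predicts the label maximizing their product:

```python
from collections import Counter, defaultdict

def train_nb(examples):
    """Naive Bayes for categorical features: estimate P(label) and
    P(feature_value | label) by maximum-likelihood counts."""
    label_counts = Counter(label for _, label in examples)
    feat_counts = defaultdict(Counter)  # (feature index, label) -> value counts
    for feats, label in examples:
        for i, v in enumerate(feats):
            feat_counts[(i, label)][v] += 1
    total = len(examples)

    def predict(feats):
        best, best_score = None, -1.0
        for label, c in label_counts.items():
            score = c / total  # prior P(label)
            for i, v in enumerate(feats):
                score *= feat_counts[(i, label)][v] / c  # P(v | label)
            if score > best_score:
                best, best_score = label, score
        return best

    return predict

# Hypothetical stand-in data (NOT the homework's table):
# features = (major studio?, sequel?), label = likes the movie
data = [(("yes", "no"), "yes"), (("yes", "yes"), "yes"),
        (("no", "no"), "no"), (("no", "yes"), "no")]
predict = train_nb(data)
```

A full Bayes classifier would model the joint P(features | label) directly; Naive Bayes instead factors it as a product of per-feature conditionals, which is the independence assumption this question exercises.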