# Computing the likelihood ratio: the binomial example


[A row of the probability matrix was truncated here; only the values 0.12, 0.32, 0.24, 0.16, 0.08 survive.]

Note that the row for Hi of the joint probability matrix is πi times the corresponding row of the likelihood matrix. Since the row sums of the likelihood matrix are one, the sum of row Hi of the joint probability matrix is πi. Therefore, the sum of all entries in the joint probability matrix is one. The joint probability matrix can be viewed as a Venn diagram.

Conditional probabilities such as P(H1 | X = 2) and P(H0 | X = 2) are called a posteriori probabilities, because they are the probabilities an observer would assign to the two hypotheses after making the observation (in this case, observing that X = 2). Given an observation such as X = 2, the maximum a posteriori (MAP) decision rule chooses the hypothesis with the larger conditional probability. By Bayes' formula,

P(H1 | X = 2) = P(H1, X = 2) / P(X = 2) = P(H1, X = 2) / (P(H1, X = 2) + P(H0, X = 2)) = 0.06 / (0.06 + 0.16).

That is, P(H1 | X = 2) is the top number in the column for X = 2 of the joint probability matrix divided by the sum of the entries in that column.
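The construction above (joint matrix = prior times likelihood row, posterior = joint entry divided by column sum) can be sketched in a few lines. This is only an illustration: the priors and likelihood values below are hypothetical, chosen so that the X = 2 column reproduces the two joint entries given in the text, P(H1, X = 2) = 0.06 and P(H0, X = 2) = 0.16.

```python
priors = {"H0": 0.8, "H1": 0.2}          # hypothetical priors (pi_0, pi_1)

# Hypothetical likelihoods P(X = k | H_i); each row sums to one.
likelihoods = {
    "H0": [0.2, 0.2, 0.2, 0.2, 0.2],
    "H1": [0.1, 0.2, 0.3, 0.2, 0.2],
}

# Row H_i of the joint probability matrix is pi_i times the likelihood row,
# so row H_i sums to pi_i and the whole matrix sums to one.
joint = {h: [priors[h] * p for p in likelihoods[h]] for h in priors}

def map_rule(x):
    """Posterior probabilities and MAP decision for the observation X = x."""
    column_sum = sum(joint[h][x] for h in joint)        # P(X = x)
    posteriors = {h: joint[h][x] / column_sum for h in joint}
    return max(posteriors, key=posteriors.get), posteriors

decision, post = map_rule(2)
print(decision, round(post["H1"], 4))   # H0 0.2727  (i.e. 0.06 / 0.22)
```

With these numbers the posterior for H1 is 0.06 / 0.22 ≈ 0.27, so the MAP rule decides H0, matching the column-sum reading of the joint probability matrix described above.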

This note was uploaded on 02/09/2014 for the course ISYE 2027, taught by Professor Zahrn during the Spring '08 term at the Georgia Institute of Technology.
