Bayesian inference

Fredrik Ronquist

October 5, 2005

BSC5936, Fall 2005 (PB, FR), Computational Evolutionary Biology

1 Introduction

The last few decades have seen a growing interest in Bayesian inference, an alternative approach to statistical inference. There are many similarities between maximum likelihood and Bayesian inference, but also important differences. Both methods use the same stochastic models of molecular evolution, but the underlying inference principles differ, as we will explore in this lecture.

2 A probability exercise

It may be helpful to start with a simple exercise in probability theory. Assume we have 100 balls. The balls are either white or black, and either small or large. Thus, there are four kinds of balls: (1) white and small; (2) white and large; (3) black and small; and (4) black and large. There are 10 balls of the first type, 15 of the second, 50 of the third, and 25 of the fourth (Table 1).

Table 1: A collection of balls

    Color    Small    Large    Total
    White       10       15       25
    Black       50       25       75
    Total       60       40      100

Now, let us calculate the joint probability of a ball picked randomly from this set being small and white, P(small, white). If the probabilities of being small or large and white or black had been independent, we could have calculated the joint probability simply as the product of the probabilities of the two events, that is

    P(small, white) = P(small) P(white)

Clearly, this is not the case for our collection of balls: the product of the marginals is (60/100)(25/100) = 15/100, whereas only 10 of the 100 balls are both small and white. An alternative approach we can use, then, is to obtain the desired probability as the product of the probability of the ball being small and the probability of it being white given that it is small. In equation form, we could write

    P(white, small) = P(small) P(white | small)

If we plugged numbers into the equation, we would get

    P(white, small) = P(small) P(white | small) = (60/100)(10/60) = 10/100.
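The table arithmetic above can be checked with a short script. This is a minimal sketch, not part of the original lecture; the dictionary layout and variable names are my own, and exact fractions are used so the results match the hand calculation:

```python
from fractions import Fraction

# Counts from Table 1, keyed by (color, size).
counts = {
    ("white", "small"): 10,
    ("white", "large"): 15,
    ("black", "small"): 50,
    ("black", "large"): 25,
}
total = sum(counts.values())  # 100 balls in all

def marginal(event):
    """Probability that a randomly picked ball satisfies `event`."""
    return Fraction(sum(n for key, n in counts.items() if event(key)), total)

p_small = marginal(lambda k: k[1] == "small")  # 60/100 = 3/5
p_white = marginal(lambda k: k[0] == "white")  # 25/100 = 1/4

# P(white | small): among the 60 small balls, 10 are white.
n_small = sum(n for (color, size), n in counts.items() if size == "small")
p_white_given_small = Fraction(counts[("white", "small")], n_small)  # 10/60 = 1/6

# The chain rule recovers the joint probability 10/100 ...
print(p_small * p_white_given_small)  # prints 1/10

# ... while multiplying the marginals does not: the two events are dependent.
print(p_small * p_white)              # prints 3/20, i.e. 15/100, not 10/100
```

The mismatch between 3/20 and 1/10 is exactly the failure of independence that the text points out.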
The joint probability of the ball being small and white can also be calculated by multiplying the probability of the ball being white by the probability of it being small given that it is white. This equation would be

    P(white, small) = P(white) P(small | white)

Plugging numbers into the equation, we would get

    P(white, small) = P(white) P(small | white) = (25/100)(10/25) = 10/100,

the same result as obtained previously. Since both methods give the same joint probability, we can equate them, in which case we get

    P(small) P(white | small) = P(white) P(small | white)

Dividing both sides by P(small), we get

    P(white | small) = P(white) P(small | white) / P(small)

Replacing white and small with two general events labeled A and B, we get the general probability statement

    P(A | B) = P(A) P(B | A) / P(B)

3 Bayes' theorem

The simple probability statement derived above is known as Bayes' rule or Bayes' theorem. When it is used in statistical inference, it is applied to data, D, and a set of parameter values, θ, of the ...
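The derivation above can be verified numerically against the ball counts. A minimal sketch (the variable names are mine; all numbers come from Table 1):

```python
from fractions import Fraction

# Quantities read directly from Table 1.
p_small = Fraction(60, 100)             # P(small)
p_white = Fraction(25, 100)             # P(white)
p_small_given_white = Fraction(10, 25)  # P(small | white): 10 of the 25 white balls are small

# Bayes' rule: P(white | small) = P(white) P(small | white) / P(small)
p_white_given_small = p_white * p_small_given_white / p_small

# Direct counting gives the same answer: 10 of the 60 small balls are white.
assert p_white_given_small == Fraction(10, 60)
print(p_white_given_small)  # prints 1/6
```

Note that Bayes' rule lets us reverse the direction of conditioning: we computed P(white | small) from P(small | white) and the two marginals, without ever counting the small balls directly. This is precisely how it is used in statistical inference, where one of the two conditional probabilities is much easier to obtain than the other.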
This note was uploaded on 11/27/2011 for the course BSC 5936 taught by Professor Staff during the Spring '08 term at FSU.