MIT OpenCourseWare
http://ocw.mit.edu

14.30 Introduction to Statistical Methods in Economics
Spring 2009

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.

14.30 Introduction to Statistical Methods in Economics
Lecture Notes 4

Konrad Menzel

February 12, 2009

Bayes' Theorem

In the last lecture, we introduced conditional probabilities, and we saw the Law of Total Probability as a way of relating the unconditional probability P(A) of an event A to the conditional probabilities P(A|B_i). Another important relationship between conditional probabilities is Bayes' Law, which relates the conditional probability P(A|B) to the conditional probability P(B|A), i.e. it tells us how to revert the order of conditioning. This result plays an important role in many areas of statistics and probability, most importantly in situations in which we learn about the state of the world A from observing the data B.

Example 1: The ancient Greeks (who apparently didn't know much statistics yet) noticed that each time after a ship had sunk, all surviving seamen reported having prayed to Poseidon, the Greek god of the sea. From this observation, they inferred that they were in fact saved from drowning because they had prayed. (This example was actually brought up by the English philosopher Francis Bacon in the 16th century.) In statistical terms, let's define the events A = "survives" and B = "prayed", so that the question becomes whether praying increases the odds of survival, i.e. whether P(A|B) > P(A) = p, say. The observation that all surviving seamen had been praying translates to P(B|A) = 1. Is that information actually sufficient to answer the question whether praying strictly increases the chances of survival? How do we use the information on P(B|A) to learn about P(A|B)?
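The example's question can be made concrete numerically. The sketch below (with made-up numbers, not from the notes) shows that P(B|A) = 1 by itself does not settle whether P(A|B) > P(A): the answer also depends on how likely non-survivors were to pray, P(B|A^C).

```python
# Illustration with hypothetical numbers: P(B|A) = 1 alone does not
# pin down whether P(A|B) > P(A). We compute
#   P(A|B) = P(B|A)P(A) / [P(B|A)P(A) + P(B|A^C)P(A^C)]
# under two hypothetical values of P(B|A^C).

def posterior(p_A, p_B_given_A, p_B_given_Ac):
    """Posterior P(A|B) via Bayes' rule with the two-event partition {A, A^C}."""
    p_B = p_B_given_A * p_A + p_B_given_Ac * (1 - p_A)  # Law of Total Probability
    return p_B_given_A * p_A / p_B

p_A = 0.7            # hypothetical prior survival probability
p_B_given_A = 1.0    # all survivors prayed (the Greeks' observation)

# Case 1: everyone prays, survivors or not -> P(B) = 1, so posterior = prior
print(posterior(p_A, p_B_given_A, 1.0))   # 0.7: praying changes nothing

# Case 2: only half of the non-survivors prayed -> posterior exceeds prior
print(posterior(p_A, p_B_given_A, 0.5))   # ~0.82 > 0.7
```

So the Greeks' observation is consistent both with praying being informative and with it being completely uninformative (when everyone prays).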
From the definition of conditional probabilities, we obtain

\[ P(AB) = P(A \mid B)\,P(B) = P(B \mid A)\,P(A) \]

Rearranging the second equality, we get

\[ P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)} \]

We've also seen that we can partition the event B, so that

\[ P(B) = P(B \mid A)\,P(A) + P(B \mid A^C)\,P(A^C) \]

and therefore

\[ P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B \mid A)\,P(A) + P(B \mid A^C)\,P(A^C)} \]

We can generalize this to any partition of S, as summarized in the following theorem:

Theorem 1 (Bayes' Theorem): If A_1, A_2, ... is a partition of S, then for any event B with P(B) > 0 we can write

\[ P(A_i \mid B) = \frac{P(B \mid A_i)\,P(A_i)}{P(B)} = \frac{P(B \mid A_i)\,P(A_i)}{\sum_{j \geq 1} P(B \mid A_j)\,P(A_j)} \]

P(A_i) is the prior probability of the event A_i (i.e. the probability before the experiment is run).

P(A_i | B) is the posterior probability of A_i (i.e. the probability after we ran the experiment and obtained the information B), as computed from Bayes' theorem.

An entire statistical theory of optimal decisions is built on this simple idea: Bayesian Decision Theory.
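Theorem 1 is straightforward to compute for a finite partition. The sketch below (the three-event priors and likelihoods are made-up numbers for illustration) turns priors P(A_i) and likelihoods P(B|A_i) into posteriors P(A_i|B), with the denominator given by the Law of Total Probability.

```python
# A minimal sketch of Theorem 1 for a finite partition A_1, ..., A_n of S.
# The priors and likelihoods below are hypothetical numbers for illustration.

def bayes_posteriors(priors, likelihoods):
    """Return the posteriors P(A_i|B) given priors P(A_i) and likelihoods P(B|A_i).

    Implements P(A_i|B) = P(B|A_i)P(A_i) / sum_j P(B|A_j)P(A_j).
    """
    joint = [l * p for l, p in zip(likelihoods, priors)]  # P(B|A_j) P(A_j)
    p_B = sum(joint)                                      # Law of Total Probability
    return [j / p_B for j in joint]

# Hypothetical three-event partition, priors summing to 1
priors      = [0.5, 0.3, 0.2]    # P(A_1), P(A_2), P(A_3)
likelihoods = [0.1, 0.4, 0.8]    # P(B|A_1), P(B|A_2), P(B|A_3)

posteriors = bayes_posteriors(priors, likelihoods)
print(posteriors)   # posteriors sum to 1; A_3's weight rises from 0.2 to ~0.48
```

Note how observing B reweights the partition: events under which B was likely (here A_3) gain posterior probability, while events under which B was unlikely lose it, exactly as the ratio in Theorem 1 dictates.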
