C260A Lecture 4: Probabilistic Modeling
Christopher Lee
October 6, 2009

Conditional Probability

$$p(S \mid C) = \frac{p(S \cap C)}{p(C)}$$

Call S the subject and C the condition variable.

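As a quick numerical illustration (a hypothetical two-coin example, not from the slides), the definition can be applied directly to a small joint table:

```python
# A minimal sketch: compute p(S | C) = p(S and C) / p(C)
# from an enumerated joint distribution.
from itertools import product

# Joint distribution of two fair coin flips (X1, X2).
joint = {(x1, x2): 0.25 for x1, x2 in product("HT", repeat=2)}

# Condition C: the first flip is heads.  Subject S: both flips are heads.
p_C = sum(p for (x1, _), p in joint.items() if x1 == "H")
p_S_and_C = joint[("H", "H")]

print(p_S_and_C / p_C)  # 0.5 = p(S | C)
```
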
Draw a Venn diagram of the Monty Hall hidden ($\delta$) vs. observed joint probability, in which the probability of each zone is proportional to its area in the diagram (similar to figure 1). Now derive the conditional probability of the hidden state given the observation (e.g. $\Pr(\delta = A \mid B^-)$) by circling the relevant regions of the Venn diagram that constitute this conditional probability.

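To check the Venn-diagram answer numerically, here is a Monte Carlo sketch under the standard Monty Hall assumptions (the player has picked door A; the host opens an empty door among B and C, choosing uniformly when both are empty; $B^-$ is read as "door B opened, no prize"). These setup details are assumptions, not stated on the slide:

```python
# Monte Carlo estimate of Pr(delta = A | B opened empty).
import random

trials = 100_000
opened_B = 0          # times the host opens door B
prize_A_given_B = 0   # times the prize is behind A in those trials

for _ in range(trials):
    prize = random.choice("ABC")               # hidden state delta
    options = [d for d in "BC" if d != prize]  # empty doors the host may open
    opened = random.choice(options)
    if opened == "B":
        opened_B += 1
        if prize == "A":
            prize_A_given_B += 1

print(prize_A_given_B / opened_B)  # ~ 1/3
```
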
Rearranging Conditional Probabilities

Given $\Pr(S = s \mid C = c)$, rewrite with the condition C moved to the subject, or the subject S moved to the condition.

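One standard route (a step filled in here, not spelled out on the slide): since intersection is symmetric, the definition of conditional probability gives

$$\Pr(C = c \mid S = s) = \frac{\Pr(S = s \cap C = c)}{\Pr(S = s)} = \frac{\Pr(S = s \mid C = c)\,\Pr(C = c)}{\Pr(S = s)},$$

which moves the condition C into the subject; reading the same identity in the other direction moves S into the condition.
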
The Chain Rule

$$p(X_3 \cap X_2 \cap X_1) = p(X_3 \mid X_2 \cap X_1)\, p(X_2 \cap X_1) = p(X_3 \mid X_2 \cap X_1)\, p(X_2 \mid X_1)\, p(X_1)$$

Generally,

$$p(X_1, X_2, X_3, \ldots, X_n) = \prod_{i=1}^{n} p(X_i \mid X_{i-1}, X_{i-2}, \ldots, X_1)$$

Since intersection is symmetric, this can be applied in any of the n! possible orders.

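As a numeric sanity check (illustrative code, not from the lecture), one can factor a random joint distribution by the chain rule and confirm that the product of conditionals reproduces the joint:

```python
# Verify p(x1, x2, x3) = p(x1) p(x2 | x1) p(x3 | x1, x2) numerically.
import numpy as np

rng = np.random.default_rng(0)
joint = rng.random((2, 2, 2))
joint /= joint.sum()                      # normalized p(x1, x2, x3)

p1 = joint.sum(axis=(1, 2))               # p(x1)
p2_given_1 = joint.sum(axis=2) / p1[:, None]            # p(x2 | x1)
p3_given_12 = joint / joint.sum(axis=2, keepdims=True)  # p(x3 | x1, x2)

rebuilt = p1[:, None, None] * p2_given_1[:, :, None] * p3_given_12
print(np.allclose(rebuilt, joint))        # True
```
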
Model = a simpler chain rule

The chain rule is always true. When we state a specific model, we are usually asserting some simplification of the joint probability, i.e. that it can be factored into simpler terms than the general chain rule. Instead of "everything depends on everything else," the problem can be broken down into several pieces that are independent.

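A standard example of such a simplification (not on this slide): a first-order Markov model asserts that each variable depends only on its immediate predecessor, collapsing the general chain rule to

$$p(X_1, X_2, \ldots, X_n) = p(X_1) \prod_{i=2}^{n} p(X_i \mid X_{i-1}).$$
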
Factoring the Joint Probability

If

$$p(O, H) = p(O)\, p(H)$$

we say O and H are independent. If

$$p(O, H \mid X) = p(O \mid X)\, p(H \mid X)$$

we say O and H are conditionally independent given X.

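A quick illustration with hypothetical numbers (not from the slides): variables built to be conditionally independent given X are generally still dependent once X is marginalized out, so the two notions are distinct.

```python
# Build p(O, H, X) = p(X) p(O|X) p(H|X), then test marginal independence.
import numpy as np

p_X = np.array([0.5, 0.5])
p_O_given_X = np.array([[0.9, 0.1],     # p(O | X): rows index X
                        [0.2, 0.8]])
p_H_given_X = np.array([[0.7, 0.3],
                        [0.4, 0.6]])

# Joint built from the factored (conditionally independent) model.
joint = p_X[:, None, None] * p_O_given_X[:, :, None] * p_H_given_X[:, None, :]

p_OH = joint.sum(axis=0)                      # marginal p(O, H)
p_O, p_H = p_OH.sum(axis=1), p_OH.sum(axis=0)
print(np.allclose(p_OH, np.outer(p_O, p_H)))  # False: marginally dependent
```
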
$$\Pr(n, N, \theta) = \binom{N}{n}\, \theta^{\,n+1} (1-\theta)^{N-n}\, 2^{1-N}$$

Are $N$, $\theta$ independent? Prove your answer.

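One route to the answer (a sketch, assuming $n$ ranges over $0, \ldots, N$ for fixed $N$): marginalize out $n$ using the binomial theorem,

$$\Pr(N, \theta) = \sum_{n=0}^{N} \binom{N}{n} \theta^{\,n+1} (1-\theta)^{N-n}\, 2^{1-N} = \theta\, 2^{1-N},$$

since $\sum_n \binom{N}{n}\theta^n(1-\theta)^{N-n} = 1$. The marginal joint factors into a function of $N$ times a function of $\theta$, which is exactly the criterion for independence.
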
Given a joint probability, for each factor $p(B \mid A)$ we draw an edge in our graph $A \to B$. If A and B are conditionally independent given C, then we do not draw an edge between A and B. Their shared information is fully captured by $C \to A$, $C \to B$.

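For instance (a standard example, not drawn on this slide), the factorization

$$p(A, B, C) = p(C)\, p(A \mid C)\, p(B \mid C)$$

corresponds to the graph $C \to A$, $C \to B$ with no edge between A and B, since A and B are conditionally independent given C.
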