# C260A Lecture 4: Probabilistic Modeling


Christopher Lee, October 6, 2009

## Conditional Probability

$$p(S \mid C) = \frac{p(S \cap C)}{p(C)}$$

Call $S$ the subject and $C$ the condition variable.
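
As a concrete illustration, the definition can be evaluated on a small joint table (the variable names and probabilities below are made up for this sketch):

```python
# Hypothetical two-variable joint distribution p(S, C) as a dict;
# the table values are invented for illustration.
joint = {
    ("s1", "c1"): 0.20, ("s1", "c2"): 0.10,
    ("s2", "c1"): 0.30, ("s2", "c2"): 0.40,
}

def p_cond(joint, s, c):
    """p(S=s | C=c) = p(S=s, C=c) / p(C=c)."""
    p_c = sum(p for (si, ci), p in joint.items() if ci == c)
    return joint[(s, c)] / p_c

print(p_cond(joint, "s1", "c1"))  # 0.20 / 0.50 = 0.4
```
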
Draw a Venn diagram of the Monty Hall hidden ($\delta$) vs. observed joint probability, in which the probability of each zone is proportional to its area in the diagram (similar to Figure 1). Now derive the conditional probability of the hidden state given the observation (e.g. $\Pr(\delta = A \mid B^-)$) by circling the relevant regions of the Venn diagram that constitute this conditional probability.
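
A minimal sketch of the computation the exercise asks for, assuming the player has picked door A and reading $\Pr(\delta = A \mid B^-)$ as "prize behind A, given that Monty opened door B and revealed no prize":

```python
from fractions import Fraction

third, half = Fraction(1, 3), Fraction(1, 2)

# joint[(hidden, opened)] = Pr(delta = hidden, Monty opens `opened`),
# assuming the player picked door A.
joint = {
    ("A", "B"): third * half,  # prize at A: Monty opens B or C at random
    ("A", "C"): third * half,
    ("B", "C"): third,         # prize at B: Monty must open C
    ("C", "B"): third,         # prize at C: Monty must open B
}

def posterior(hidden, opened):
    """Pr(delta = hidden | Monty opened `opened` and showed no prize)."""
    norm = sum(p for (h, o), p in joint.items() if o == opened)
    return joint.get((hidden, opened), Fraction(0)) / norm

print(posterior("A", "B"))  # 1/3: sticking with the original pick
print(posterior("C", "B"))  # 2/3: switching wins
```

Circling the regions of the Venn diagram where Monty opens B corresponds to the normalizing sum in the denominator.
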

## Rearranging Conditional Probabilities

Given $\Pr(S = s \mid C = c)$, rewrite with the condition $C$ moved to the subject, or the subject $S$ moved to the condition.
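
One standard way to carry out the rearrangement is Bayes' rule, obtained by writing the joint probability with the chain rule in both orders:

$$\Pr(S = s \cap C = c) = \Pr(S = s \mid C = c)\,\Pr(C = c) = \Pr(C = c \mid S = s)\,\Pr(S = s)$$

so that

$$\Pr(C = c \mid S = s) = \frac{\Pr(S = s \mid C = c)\,\Pr(C = c)}{\Pr(S = s)}$$
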
## The Chain Rule

$$p(X_3 \cap X_2 \cap X_1) = p(X_3 \mid X_2 \cap X_1)\,p(X_2 \cap X_1) = p(X_3 \mid X_2 \cap X_1)\,p(X_2 \mid X_1)\,p(X_1)$$

Generally,

$$p(X_1, X_2, X_3, \dots, X_n) = \prod_{i=1}^{n} p(X_i \mid X_{i-1}, X_{i-2}, \dots, X_1)$$

Since intersection is symmetric, this can be applied in any of the $n!$ possible orders.
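
The factorization can be checked numerically on a small joint distribution; the distribution below is randomly generated purely for illustration:

```python
import itertools
import random

random.seed(0)

# A random joint distribution over three binary variables.
states = list(itertools.product([0, 1], repeat=3))
w = [random.random() for _ in states]
joint = {s: wi / sum(w) for s, wi in zip(states, w)}

def marg(fixed):
    """Marginal probability that the given positions take the given values."""
    return sum(p for s, p in joint.items()
               if all(s[i] == v for i, v in fixed.items()))

def chain(order):
    """Rebuild p(x1, x2, x3) for one outcome via the chain rule in `order`."""
    x = (1, 0, 1)  # an arbitrary fixed outcome
    prob, seen = 1.0, {}
    for i in order:
        num = marg({**seen, i: x[i]})      # p(x_i, previously seen)
        den = marg(seen) if seen else 1.0  # p(previously seen)
        prob *= num / den                  # p(x_i | previously seen)
        seen[i] = x[i]
    return prob

# Any of the 3! orderings reproduces the same joint probability.
print(abs(chain((0, 1, 2)) - joint[(1, 0, 1)]) < 1e-12)  # True
print(abs(chain((2, 0, 1)) - joint[(1, 0, 1)]) < 1e-12)  # True
```
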

## Model = a Simpler Chain Rule

The chain rule is always true. When we state a specific *model*, we are usually asserting some simplification of the joint probability, i.e. that it can be factored into simpler terms than the general chain rule. Instead of "everything depends on everything else," the problem can be broken down into several pieces that are independent.
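
One way to make "simpler" concrete is to count free parameters; the Markov-chain factorization below is a hypothetical example of a model, not one from the lecture:

```python
# Parameter counting for a joint distribution over n binary variables,
# comparing the unrestricted chain rule to a simplified (Markov) model.
def full_joint_params(n):
    """Unrestricted chain rule: each table p(X_i | X_{i-1} .. X_1) has
    2^(i-1) free parameters, summing to 2^n - 1."""
    return sum(2 ** (i - 1) for i in range(1, n + 1))

def markov_chain_params(n):
    """Model: X_i depends only on X_{i-1}; one parameter for p(X_1),
    two per later factor p(X_i | X_{i-1})."""
    return 1 + 2 * (n - 1)

print(full_joint_params(10))    # 1023
print(markov_chain_params(10))  # 19
```
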
## Factoring the Joint Probability

If $p(O, H) = p(O)\,p(H)$, we say $O$ and $H$ are *independent*.

If $p(O, H \mid X) = p(O \mid X)\,p(H \mid X)$, we say $O$ and $H$ are *conditionally independent given $X$*.
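
A toy distribution (values invented for this sketch) shows that the two notions can come apart: $O$ and $H$ are conditionally independent given $X$, yet not marginally independent:

```python
import itertools

# O and H each depend on X, but not directly on each other.
p_x = {0: 0.5, 1: 0.5}
p_o_given_x = {0: 0.2, 1: 0.9}   # Pr(O=1 | X=x)
p_h_given_x = {0: 0.3, 1: 0.7}   # Pr(H=1 | X=x)

def bern(p, v):
    return p if v == 1 else 1 - p

joint = {(o, h, x): p_x[x] * bern(p_o_given_x[x], o) * bern(p_h_given_x[x], h)
         for o, h, x in itertools.product([0, 1], repeat=3)}

def p(**fix):
    """Marginal probability of the fixed variable assignments."""
    return sum(q for (o, h, x), q in joint.items()
               if all({"o": o, "h": h, "x": x}[k] == v for k, v in fix.items()))

# Conditional independence given X holds by construction:
lhs = p(o=1, h=1, x=0) / p(x=0)
rhs = (p(o=1, x=0) / p(x=0)) * (p(h=1, x=0) / p(x=0))
print(abs(lhs - rhs) < 1e-12)                       # True

# Marginal independence fails: X couples O and H.
print(abs(p(o=1, h=1) - p(o=1) * p(h=1)) < 1e-12)   # False
```
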

$$\Pr(n, N, \theta) = \binom{N}{n}\,\theta^{\,n+1}\,(1-\theta)^{N-n}\,2^{\,1-N}$$

Are $N$ and $\theta$ independent? Prove your answer.
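
As a numerical probe only (the exercise still asks for a proof), one can marginalize the given joint over $n$ and check whether the result factors into a function of $N$ times a function of $\theta$, e.g. by testing whether ratios across $\theta$ values depend on $N$:

```python
from math import comb

def p_joint(n, N, theta):
    """The joint probability stated in the exercise."""
    return comb(N, n) * theta ** (n + 1) * (1 - theta) ** (N - n) * 2 ** (1 - N)

def p_N_theta(N, theta):
    """Marginalize over n = 0 .. N."""
    return sum(p_joint(n, N, theta) for n in range(N + 1))

# If p(N, theta) = f(N) g(theta), this ratio is the same for every N.
ratios = {N: p_N_theta(N, 0.7) / p_N_theta(N, 0.3) for N in (1, 3, 8)}
print(ratios)
```

Inspecting whether the printed ratios agree across $N$ suggests the answer; the proof itself should use the binomial theorem on the sum.
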
Given a joint probability, for each factor $p(B \mid A)$ we draw an edge $A \to B$ in our graph. If $A$ and $B$ are conditionally independent given $C$, then we do not draw an edge between $A$ and $B$: their shared information is fully captured by $C \to A$ and $C \to B$.
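
The rule can be sketched as code; the variable names and the factorization $p(C)\,p(A \mid C)\,p(B \mid C)$ below are illustrative assumptions, not from the lecture:

```python
# Read a graph straight off a factorization: one edge per (parent, child)
# pair appearing in a conditional factor.
factors = [("C", ()), ("A", ("C",)), ("B", ("C",))]  # (child, parents)

edges = [(parent, child) for child, parents in factors for parent in parents]
print(edges)  # [('C', 'A'), ('C', 'B')] -- no A-B edge: A, B cond. indep. given C
```
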


## This note was uploaded on 04/12/2010 for the course CHEM 260A, taught by Professor Chris Lee during the Spring '10 term at UCLA.


