CSE 6740 Lecture 22
How Do I Evaluate Deeply Nested Sums? (Graphical Model Inference)
Alexander Gray, agray@cc.gatech.edu
Georgia Institute of Technology

Today
1. Graphical Model Computations
2. Exact Inference Algorithms
3. Approximate Inference Algorithms

Graphical Model Computations
What we want to compute with graphical models.

Sprinkler Example
Recall the sprinkler example.

Conditional Probability Inference
Inference is a general statistical term. In the context of graphical models it means finding the probability distribution of a variable of interest given that some of the other variables have fixed values. For example, suppose we observe that the grass is wet. There are two possible causes: either it is raining, or the sprinkler is on. Which is more likely?

We can use Bayes' rule to compute the posterior probability of each explanation:

P(S=1 | W=1) = P(S=1, W=1) / P(W=1)                                      (1)
             = \sum_{c,r} P(C=c, S=1, R=r, W=1) / P(W=1) = 0.43          (2)

P(R=1 | W=1) = P(R=1, W=1) / P(W=1)                                      (3)
             = \sum_{c,s} P(C=c, S=s, R=1, W=1) / P(W=1) = 0.71          (4)

where

P(W=1) = \sum_{c,r,s} P(C=c, S=s, R=r, W=1) = 0.65                       (5)

is the normalizing constant. So we see that it is more likely that the grass is wet because it is raining.

Graphical Model Computations
Let (E, Q) be a partitioning of the variables into evidence and query variables. Two main quantities we desire are:

Marginal probabilities:

P(Q=q) = \sum_e P(Q=q, E=e)                                              (6)

or P(E=e) = \sum_q P(Q=q, E=e).

Maximum a posteriori (MAP) probabilities:

P^*(Q=q) = \max_e P(Q=q, E=e).                                           (7)
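The sprinkler posteriors above can be reproduced by brute-force enumeration of the joint. The conditional probability tables below are the standard ones from the usual sprinkler example (an assumption here, since the lecture's actual tables are not shown in this excerpt); the network is C -> S, C -> R, {S, R} -> W.

```python
from itertools import product

# Assumed CPTs for the classic sprinkler network (not shown in this excerpt).
p_c = {1: 0.5, 0: 0.5}
p_s_given_c = {0: {1: 0.5, 0: 0.5}, 1: {1: 0.1, 0: 0.9}}
p_r_given_c = {0: {1: 0.2, 0: 0.8}, 1: {1: 0.8, 0: 0.2}}
p_w_given_sr = {(0, 0): {1: 0.0, 0: 1.0},
                (0, 1): {1: 0.9, 0: 0.1},
                (1, 0): {1: 0.9, 0: 0.1},
                (1, 1): {1: 0.99, 0: 0.01}}

def joint(c, s, r, w):
    """P(C=c, S=s, R=r, W=w) via the directed factorization."""
    return p_c[c] * p_s_given_c[c][s] * p_r_given_c[c][r] * p_w_given_sr[(s, r)][w]

# Normalizing constant, eq. (5): sum out c, s, r with W clamped to 1.
p_w1 = sum(joint(c, s, r, 1) for c, s, r in product([0, 1], repeat=3))

# Posterior of each explanation given wet grass, eqs. (2) and (4).
p_s1_w1 = sum(joint(c, 1, r, 1) for c, r in product([0, 1], repeat=2)) / p_w1
p_r1_w1 = sum(joint(c, s, 1, 1) for c, s in product([0, 1], repeat=2)) / p_w1

print(round(p_w1, 2), round(p_s1_w1, 2), round(p_r1_w1, 2))  # 0.65 0.43 0.71
```

Rain comes out as the more probable explanation, matching the slide. Note the cost: the sums range over every joint configuration, which is exactly the deeply nested sum that exact inference algorithms try to evaluate more cleverly.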
Graphical Model Computations
From these basic quantities we can obtain other quantities such as conditional probabilities:

P(Q=q | E=e) = P(Q=q, E=e) / P(E=e)
             = P(Q=q, E=e) / \sum_q P(Q=q, E=e).                         (8)

In general, multiple marginalizations (summations) must be performed. For example, if (E, Q, H) (evidence, query, and hidden variables) is a partitioning of the variables,

P(Q=q | E=e) = P(Q=q, E=e) / \sum_q P(Q=q, E=e)                          (9)
             = \sum_h P(Q=q, E=e, H=h) / \sum_q \sum_h P(Q=q, E=e, H=h). (10)

Graphical Model Computations
For a directed graph, we can represent the joint probability as

P(X_1, ..., X_D) = \prod_i P(X_i | \pi(X_i))                             (11)

where \pi(X) denotes the set of parents of X and P(X_i | \pi(X_i)) is the local conditional probability associated with node X_i.
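Eq. (10) can be sketched directly: the hidden variable is summed out of the numerator, and both query and hidden variables are summed out of the denominator. The tiny chain Q -> H -> E and its binary CPT values below are hypothetical, chosen only to make the marginalization concrete.

```python
from itertools import product

# Hypothetical chain Q -> H -> E with made-up binary CPTs (illustration only).
p_q = {1: 0.3, 0: 0.7}
p_h_given_q = {0: {1: 0.2, 0: 0.8}, 1: {1: 0.6, 0: 0.4}}
p_e_given_h = {0: {1: 0.1, 0: 0.9}, 1: {1: 0.7, 0: 0.3}}

def joint(q, h, e):
    """P(Q=q, H=h, E=e) via the directed factorization of eq. (11)."""
    return p_q[q] * p_h_given_q[q][h] * p_e_given_h[h][e]

def posterior(q, e):
    """P(Q=q | E=e) as in eq. (10): sum out H above, sum out Q and H below."""
    num = sum(joint(q, h, e) for h in (0, 1))
    den = sum(joint(qp, h, e) for qp, h in product((0, 1), repeat=2))
    return num / den

print(posterior(1, 1))
```

The denominator repeats the numerator's sum for every query value, so the posterior automatically normalizes to one over q. With D binary hidden variables the inner sum has 2^D terms, which is why naive marginalization becomes intractable and exact inference algorithms exploit the graph structure instead.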