# Lesson 29 - Module 10: Reasoning with Uncertainty

Module 10: Reasoning with Uncertainty - Probabilistic Reasoning
Version 1, CSE IIT, Kharagpur

Lesson 29: A Basic Idea of Inferencing with Bayes Networks
10.5.5 Inferencing in Bayesian Networks

10.5.5.1 Exact Inference

The basic inference problem in BNs is described as follows. Given:

1. A Bayesian network BN
2. Evidence e - an instantiation of some of the variables in BN (e can be empty)
3. A query variable Q

Compute P(Q|e) - the (marginal) conditional distribution over Q. In other words: given what we do know, compute the distribution over what we do not.

Four categories of inferencing tasks are usually encountered.

1. Diagnostic inferences (from effects to causes). Given that John calls, what is the probability of burglary? i.e. find P(B|J).
2. Causal inferences (from causes to effects). Given burglary, what is the probability that John calls, i.e. P(J|B), or that Mary calls, i.e. P(M|B)?
3. Intercausal inferences (between causes of a common effect). Given alarm, what is the probability of burglary, i.e. P(B|A)? Now given earthquake as well, what is the probability of burglary, i.e. P(B|A&E)?
4. Mixed inferences (some causes and some effects known). Given that John calls and there is no earthquake, what is the probability of alarm, i.e. P(A|J,~E)?

We will demonstrate below the inferencing procedure for BNs. As an example, consider the following linear BN without any a priori evidence. Consider computing all the marginals (with no evidence). P(A) is given.
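All four query types can be answered by brute-force summation over the joint distribution, which the chain rule of the BN lets us write as a product of CPT entries. The sketch below does this for the burglary network. The CPT values are an assumption: the lesson's own tables are not reproduced in this preview, so the numbers here are the widely used ones from Russell and Norvig's alarm example.

```python
from itertools import product

# CPTs for the burglary network. Values are ASSUMED (standard Russell &
# Norvig alarm example); the lesson's own tables are not shown here.
P_B = {True: 0.001, False: 0.999}          # P(Burglary)
P_E = {True: 0.002, False: 0.998}          # P(Earthquake)
P_A = {                                    # P(Alarm=True | B, E)
    (True, True): 0.95, (True, False): 0.94,
    (False, True): 0.29, (False, False): 0.001,
}
P_J = {True: 0.90, False: 0.05}            # P(JohnCalls=True | A)
P_M = {True: 0.70, False: 0.01}            # P(MaryCalls=True | A)

def joint(b, e, a, j, m):
    """Probability of one full assignment, via the chain rule on the BN."""
    pa = P_A[(b, e)] if a else 1 - P_A[(b, e)]
    pj = P_J[a] if j else 1 - P_J[a]
    pm = P_M[a] if m else 1 - P_M[a]
    return P_B[b] * P_E[e] * pa * pj * pm

def query(var, evidence):
    """P(var=True | evidence) by summing the joint over all 2^5 worlds."""
    num = den = 0.0
    for b, e, a, j, m in product([True, False], repeat=5):
        world = {"B": b, "E": e, "A": a, "J": j, "M": m}
        if any(world[k] != v for k, v in evidence.items()):
            continue
        p = joint(b, e, a, j, m)
        den += p
        if world[var]:
            num += p
    return num / den

# Diagnostic inference: P(B|J). The evidence raises the prior P(B) = 0.001.
print(query("B", {"J": True}))
# Causal inference: P(J|B).
print(query("J", {"B": True}))
```

This is exponential in the number of variables, which motivates the more efficient summation schemes discussed next.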

The marginal P(B) can then be computed as

P(B) = Σ_A P(B|A) P(A)

We don't need any conditional independence assumption for this. For example, suppose A and B are binary; then we have

P(B=t) = P(B=t|A=t) P(A=t) + P(B=t|A=f) P(A=f)

Similarly, P(C) = Σ_B P(C|B) P(B). Now, P(B) (the marginal distribution over B) was not given originally, but we just computed it in the last step, so we're OK (assuming we remembered to store P(B) somewhere). Note that we had to wait for P(B) before P(C) was calculable. If C were not independent of A given B, we would have a CPT for P(C|A,B), not P(C|B).

If each node has k values and the chain has n nodes, this algorithm has complexity O(nk^2). Summing over the joint instead has complexity O(k^n). The complexity is reduced by more efficient summation, by "pushing sums into products".

Dynamic programming may also be used for the problem of exact inferencing in the above Bayes net. The steps are as follows:

1. We first compute f1(B) = Σ_A P(A) P(B|A).
2. f1(B) is a function representable by a table of numbers, one for each possible value of B.
3. We then use f1(B) to calculate f2(C) by summation over B: f2(C) = Σ_B f1(B) P(C|B).
4. Finally, we obtain P(D) = Σ_C f2(C) P(D|C).

This method of solving a problem by solving subproblems and storing the intermediate results is characteristic of dynamic programming.

The above methodology may be generalized. Suppose the variables are binary and we want P(C|A=t, E=t). Computing P(C, A=t, E=t) is enough: it is a table of numbers, one for each value of C, and we just renormalise it so that its entries add up to 1. We eliminated variables starting from the root of the network.
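The dynamic-programming pass on the chain A -> B -> C -> D can be sketched in a few lines. The CPT numbers below are illustrative assumptions, not taken from the lesson; the point is that each table f_i is computed once, stored, and reused, giving O(nk^2) work, and that the result matches brute-force summation over the O(k^n) joint.

```python
from itertools import product

k = 2  # each variable is binary

# Illustrative (ASSUMED) CPTs for the chain A -> B -> C -> D.
P_A = [0.3, 0.7]                          # P(A)
P_B_given_A = [[0.9, 0.1], [0.2, 0.8]]    # P(B|A): row indexes A, column B
P_C_given_B = [[0.6, 0.4], [0.5, 0.5]]    # P(C|B)
P_D_given_C = [[0.7, 0.3], [0.1, 0.9]]    # P(D|C)

# Step 1: f1(B) = sum_A P(A) P(B|A) -- a table, one number per value of B.
f1 = [sum(P_A[a] * P_B_given_A[a][b] for a in range(k)) for b in range(k)]

# Step 3: f2(C) = sum_B f1(B) P(C|B), reusing the stored table f1.
f2 = [sum(f1[b] * P_C_given_B[b][c] for b in range(k)) for c in range(k)]

# Step 4: P(D) = sum_C f2(C) P(D|C).
P_D = [sum(f2[c] * P_D_given_C[c][d] for c in range(k)) for d in range(k)]

# Cross-check against brute-force summation over the full joint (O(k^n)).
brute = [0.0, 0.0]
for a, b, c, d in product(range(k), repeat=4):
    brute[d] += P_A[a] * P_B_given_A[a][b] * P_C_given_B[b][c] * P_D_given_C[c][d]

print(P_D, brute)  # the two answers agree
```

For a conditional query such as P(C|A=t, E=t), the same machinery yields the unnormalised table P(C, A=t, E=t), which is then divided by its sum so the entries add up to 1.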
