Inference 2: Exact and Approximate Inference

10/25/10

Exact Inference (Last Class)

variable elimination
§ polytrees (directed graphs with at most one undirected path between any two vertices; a subset of DAGs)
§ computes specific marginals

belief propagation
§ polytrees
§ computes any marginals
§ polynomial-time algorithm

junction tree algorithm
§ arbitrary graphs
§ computes any marginals
§ may be exponential

Computational Complexity of Exact Inference

Exact inference is exponential in the number of nodes in a clique, since we need to integrate (or sum) over all of the clique's nodes. The goal is to find a triangulation that yields the smallest maximal clique; that is an NP-hard problem → approximate inference.
Example of Intractability

Multiple cause model: the X_i are binary hidden causes of an observation Y:

    Y = X_1 + X_2 + X_3 + X_4 + X_5 + X_6 + η

Compute P(X_1, X_2, X_3, X_4, X_5, X_6 | Y). What happens as the number of X's increases?
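The source of the intractability can be made concrete: the exact posterior is a table over all 2^n configurations of the causes, so the cost doubles with every cause added. A brute-force sketch, under assumed distributions the slide does not specify (Bernoulli(0.5) causes and Gaussian noise η, so the prior is uniform and cancels):

```python
import itertools, math

# Assumptions for illustration (not given on the slide): each X_i is
# Bernoulli(0.5), and eta is Gaussian, so P(y | x) is a normal density
# centered at sum(x). The uniform prior cancels in the normalization.
def posterior(y, n=6, sigma=1.0):
    """P(x_1..x_n | y) by brute-force enumeration of all 2^n cause vectors."""
    def likelihood(x):
        return math.exp(-0.5 * ((y - sum(x)) / sigma) ** 2)
    configs = list(itertools.product([0, 1], repeat=n))
    weights = [likelihood(x) for x in configs]
    Z = sum(weights)  # normalizing constant: a sum over 2^n terms
    return {x: w / Z for x, w in zip(configs, weights)}

post = posterior(y=3.0)
print(len(post))  # 64 = 2^6 table entries; 2^n in general
```

The `posterior` helper and its parameterization are hypothetical; the point is only that the enumeration, and hence the normalizing sum, grows as 2^n.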
Approximate Inference

Exact inference algorithms exploit conditional independencies in the joint probability distribution. Approximate inference exploits the law of large numbers: sums and products of many terms behave in simple ways. The idea is to approximate an intractable integral/sum with samples from the distribution. The appropriate algorithm depends on ...
Monte Carlo

Instead of obtaining p(x) analytically, sample from the distribution: draw i.i.d. samples {x^(i)}, i = 1..N (e.g., 20 coin flips with 11 heads). With enough samples, you can estimate even continuous distributions empirically:

    ∫ F(x) p(x) dx ≅ (1/N) Σ_{i=1}^{N} F(x^(i))

This works if you can sample from p(x) directly (e.g., Bernoulli, Gaussian random variables).
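The estimator above can be sketched in a few lines of Python; the choice of a standard Gaussian for p(x) and F(x) = x² is an illustrative assumption, not from the slides:

```python
import random

def monte_carlo_estimate(F, sample_p, N=100_000):
    """Estimate the integral of F(x) p(x) dx as (1/N) * sum_i F(x_i), x_i ~ p."""
    return sum(F(sample_p()) for _ in range(N)) / N

# Example: E[x^2] under a standard Gaussian is its variance, 1.
random.seed(0)
est = monte_carlo_estimate(lambda x: x * x, lambda: random.gauss(0.0, 1.0))
print(est)  # close to 1.0
```

Note that the estimator only needs the ability to *draw* from p(x), never to evaluate it, which is exactly the condition the slide states.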
What if you can't sample from p(x) ...
§ ... but you can evaluate p(x)?
§ ... but you can only evaluate p(x) up to a proportionality constant?
Rejection Sampling

Cannot sample from p(x), but can evaluate it up to a proportionality constant. Instead of sampling from p(x) directly, use an easy-to-sample proposal distribution q(x) satisfying p(x) ≤ M q(x), M < ∞. Draw a proposal x ~ q(x) and accept it with probability p(x) / [M q(x)]; otherwise reject it and draw again.
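The accept/reject loop can be sketched as follows. The target here is an assumed toy example, p(x) ∝ x(1 − x) on [0, 1] (a Beta(2, 2) density up to its normalizing constant), with a Uniform(0, 1) proposal and M = 0.25, since max x(1 − x) = 0.25 and q(x) = 1:

```python
import random

def rejection_sample(p_tilde, sample_q, q, M):
    """Draw one sample from p ∝ p_tilde: propose x ~ q, accept with
    probability p_tilde(x) / (M * q(x)), else propose again."""
    while True:
        x = sample_q()
        if random.random() < p_tilde(x) / (M * q(x)):
            return x

# Toy target (an assumption for this sketch): p(x) ∝ x(1-x) on [0, 1],
# Uniform(0, 1) proposal, M = 0.25 so that p_tilde(x) <= M * q(x) holds.
random.seed(0)
samples = [rejection_sample(lambda x: x * (1 - x),
                            random.random, lambda x: 1.0, 0.25)
           for _ in range(20_000)]
print(sum(samples) / len(samples))  # mean of Beta(2, 2) is 0.5
```

Because only the ratio p̃(x)/[M q(x)] is used, the unknown normalizing constant of p(x) never enters, which is why the method works when p(x) is known only up to proportionality.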
Rejection Sampling Problem

It may be difficult to find a q(x) with a small M that is easy to sample from.

Examples
§ Sampling P(X_1 | X_2 = x_2) from a Bayes net: sample from the full joint, P(X_1, X_2, X_3, X_4, X_5, ...), and reject cases where X_2 ≠ x_2.
§ Estimating E(x² | x > 4) where x ~ N(0, 1).
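The second example is pathological: proposing x ~ N(0, 1) and rejecting every x ≤ 4 discards nearly every draw, since P(x > 4) ≈ 3 × 10⁻⁵ for a standard normal. A quick empirical check of the acceptance rate (a toy illustration, not from the slides):

```python
import random

random.seed(0)
N = 1_000_000
# Naive rejection for the tail event x > 4: almost everything is rejected.
accepted = sum(1 for _ in range(N) if random.gauss(0.0, 1.0) > 4.0)
print(accepted, "of", N, "samples accepted")  # on the order of a few dozen
```

So roughly 30,000 proposals are needed per accepted sample, which motivates the importance-sampling idea on the next slide.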
Importance Sampling

Use when
§ you can evaluate p(x)
§ you have an easily sampled proposal distribution q(x)
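Under exactly these two conditions, expectations under p can be estimated by drawing from q and reweighting each sample by w(x) = p(x)/q(x), so that E_p[F(x)] ≈ (1/N) Σ_i w(x_i) F(x_i). A minimal sketch, where the particular densities (target N(0, 1), wider proposal N(0, 2)) are assumptions for illustration:

```python
import math, random

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def importance_estimate(F, p, q, sample_q, N=200_000):
    """Estimate E_p[F(x)] as (1/N) * sum_i w_i F(x_i), with x_i ~ q and
    importance weights w_i = p(x_i) / q(x_i)."""
    total = 0.0
    for _ in range(N):
        x = sample_q()
        total += (p(x) / q(x)) * F(x)
    return total / N

# Toy check: E[x^2] = 1 under N(0, 1), estimated from a wider N(0, 2) proposal.
random.seed(0)
est = importance_estimate(lambda x: x * x,
                          lambda x: normal_pdf(x, 0.0, 1.0),
                          lambda x: normal_pdf(x, 0.0, 2.0),
                          lambda: random.gauss(0.0, 2.0))
print(est)  # close to 1.0
```

Unlike rejection sampling, no draw is discarded; low-probability proposals simply get small weights, which is why this estimator needs p(x) to be evaluable, as the slide requires.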
