Ch8.3.1-Introduction

Inference in Graphical Models
Sargur Srihari
srihari@cedar.buffalo.edu
Topics
1. Objective of inference algorithms
2. Three simple cases
   1. Bayes theorem as inference
   2. Inference on a chain
   3. Inference on a tree
3. Factor graphs
4. Exact inference algorithms for tree graphs
   1. The sum-product algorithm
   2. The max-sum algorithm
5. Exact inference in general graphs
6. Approximate inference: loopy belief propagation
7. Learning the graph structure
What is inference?
• Graphical models represent a joint probability distribution.
• Types of inference tasks (a small worked example follows this list):
  1. Compute marginal probabilities
     – Conditional probabilities can then be computed easily from the joint and the marginals.
  2. Evaluate posterior distributions
     – Some of the nodes in the graph are clamped to observed values X.
     – Compute the posterior distribution of one or more subsets of nodes (latent variables Z), i.e., p(Z|X).
  3. Compute maximum a posteriori (MAP) probabilities
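As a concrete illustration of these three tasks (not from the slides; the joint table below is a made-up example), a minimal Python sketch on a two-variable discrete distribution:

    import numpy as np

    # Hypothetical joint p(x, y) over two binary variables; rows index x,
    # columns index y. The numbers are illustrative only.
    joint = np.array([[0.3, 0.1],
                      [0.2, 0.4]])

    # 1. Marginals: sum out the other variable.
    p_x = joint.sum(axis=1)                    # p(x) = [0.4, 0.6]
    p_y = joint.sum(axis=0)                    # p(y) = [0.5, 0.5]

    # Conditionals follow from joint and marginals: p(x|y) = p(x,y) / p(y).
    p_x_given_y = joint / p_y                  # column j holds p(x | y=j)

    # 2. Posterior with an observation: clamp y = 1 and renormalize over x.
    posterior_x = joint[:, 1] / joint[:, 1].sum()   # p(x | y=1) = [0.2, 0.8]

    # 3. MAP: the most probable latent state under that posterior.
    x_map = posterior_x.argmax()               # 1

The same three operations (marginalize, condition, maximize) are what the algorithms in this chapter carry out efficiently on larger graphs, where the joint table is far too big to enumerate.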
Inference Algorithms
• Exploit the graphical structure to find efficient algorithms.
• These algorithms can be expressed as passing messages around the graph.
• Two types of inference algorithms:
  – Exact
    • We only discuss exact inference algorithms.
  – Approximate methods
    • Useful when there are a large number of latent variables, e.g., variational Bayes.
Three Simple Inference Cases
1. Bayes theorem as inference
2. Inference on a chain
3. Inference on a tree
1. Bayes Theorem as Inference
• Consider a joint distribution p(x,y) over two variables x and y.
  – It factors as p(x,y) = p(x) p(y|x),
  – represented as the directed graph in (a).
• We are given the CPDs p(x) and p(y|x).
• If we observe the value of y, as in (b):
  – we can view the marginal p(x) as a prior over the latent variable x,
  – and we wish to infer the posterior distribution p(x|y).
• Analogy: a two-class classifier, with x = 0, 1 and y continuous.
Inferring the Posterior Using Bayes
• Using the sum and product rules, we can evaluate the marginal
      p(y) = Σ_x' p(y|x') p(x')
  – This requires evaluating a summation over x.
• The marginal is then used in Bayes' rule to calculate
      p(x|y) = p(y|x) p(x) / p(y)
• Observations:
  – The joint is now expressed as p(x,y) = p(y) p(x|y),
    which is shown in (c).
  – Thus, knowing the value of y, we know the distribution of x.
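A minimal numerical sketch of this two-step computation (not from the slides; the prior, class means, and observed y below are illustrative assumptions), using a binary x and a Gaussian p(y|x):

    import numpy as np
    from scipy.stats import norm

    # Hypothetical model: binary latent x with prior p(x), and a Gaussian
    # class-conditional density p(y|x) for each value of x.
    prior = np.array([0.6, 0.4])              # p(x=0), p(x=1)
    means = np.array([-1.0, 2.0])             # mean of p(y|x) for x = 0, 1
    stds = np.array([1.0, 1.0])

    y_obs = 0.5                               # observed value of y

    lik = norm.pdf(y_obs, loc=means, scale=stds)   # p(y_obs|x) for x = 0, 1

    # Marginal p(y) = Σ_x' p(y|x') p(x') -- the summation noted above.
    p_y = np.sum(lik * prior)

    # Bayes' rule: p(x|y) = p(y|x) p(x) / p(y).
    posterior = lik * prior / p_y

    print(posterior)                          # p(x | y=0.5); sums to 1

This is exactly the two-class-classifier reading of the graph: the prior over x is updated into a posterior once y is clamped to its observed value.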
2. Inference on a Chain
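As a hedged sketch of the idea behind chain inference (the potentials and sizes below are illustrative assumptions, not from the slides): on a chain of N discrete variables linked by pairwise potentials ψ_n(x_n, x_{n+1}), the marginal of any node can be computed by passing one forward and one backward message along the chain.

    import numpy as np

    rng = np.random.default_rng(0)

    # Chain of N discrete variables with K states each, joined by pairwise
    # potentials psi[n] between x_n and x_{n+1}; random values, purely for
    # illustration.
    N, K = 5, 3
    psi = [rng.random((K, K)) for _ in range(N - 1)]

    # Forward messages: mu_alpha[n] sums out everything left of node n.
    mu_alpha = [np.ones(K)]
    for n in range(N - 1):
        mu_alpha.append(psi[n].T @ mu_alpha[-1])

    # Backward messages: mu_beta[n] sums out everything right of node n.
    mu_beta = [np.ones(K) for _ in range(N)]
    for n in range(N - 2, -1, -1):
        mu_beta[n] = psi[n] @ mu_beta[n + 1]

    def marginal(n):
        # Marginal of node n: product of its two incoming messages,
        # normalized to sum to 1.
        m = mu_alpha[n] * mu_beta[n]
        return m / m.sum()

    print(marginal(2))

Each message update costs O(K^2), so every marginal on the chain is obtained in O(N K^2) time, rather than the O(K^N) cost of summing the joint distribution by brute force.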