Machine Learning (Srihari)

Inference: Loopy Belief Propagation
Sargur Srihari
srihari@cedar.buffalo.edu

Loopy belief propagation
• In practice exact inference may not be possible
• Approaches in such cases:
  1. Variational methods, which are deterministic
  2. Sampling or Monte Carlo methods
     – Based on stochastic numerical sampling from distributions
  3. Loopy belief propagation
     – Apply the sum-product algorithm even though there is no guarantee of good results
     – The message-passing schedule is modified:
       – A flooding schedule simultaneously passes a message across every link in both directions
       – A serial schedule passes one message at each time step

7. Learning the graph structure
• We have assumed that the structure of the
graph is known and fixed
• It is interesting to go beyond inference and
learn the graph structure from data
• Requires defining a set of possible structures and a measure to score each structure

Bayesian learning of graph structure
• Compute a posterior distribution over graph structures
  – Make predictions by averaging with respect to this distribution
• If we have a prior p(m) over graphs indexed by m, then the posterior is
  – p(m|D) ∝ p(m)p(D|m)
  – where D is the data set
• The model evidence p(D|m) provides a score for each model
  – Evaluating the evidence involves marginalization over latent variables
  – Computationally challenging for many models
• Exploring the space of structures is also problematic
  – The number of different graph structures grows exponentially with the number of nodes
  – Necessary to use heuristics to find good candidates
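The posterior p(m|D) ∝ p(m)p(D|m) above can be sketched numerically. The candidate structures and their log-evidence values below are hypothetical placeholders: in practice each log p(D|m) would come from the (often expensive) marginalization over latent variables mentioned above.

```python
import math

# Hypothetical log-evidence values log p(D|m) for three candidate
# structures m (these numbers are illustrative, not computed from data).
log_evidence = {"chain": -10.2, "tree": -9.1, "full": -9.8}

# Uniform prior p(m) over the candidate structures.
log_prior = {m: math.log(1.0 / len(log_evidence)) for m in log_evidence}

# Unnormalized log-posterior: log p(m) + log p(D|m).
log_joint = {m: log_prior[m] + log_evidence[m] for m in log_evidence}

# Normalize in log space with the log-sum-exp trick for stability.
mx = max(log_joint.values())
log_z = mx + math.log(sum(math.exp(v - mx) for v in log_joint.values()))
posterior = {m: math.exp(v - log_z) for m, v in log_joint.items()}

# Predictions would then average each model's output weighted by p(m|D).
print(posterior)
```

With a uniform prior the posterior simply renormalizes the evidence, so the structure with the highest log-evidence ("tree" in this sketch) receives the largest posterior weight.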
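To make the exponential growth of the structure space concrete, here is a short sketch counting labeled directed acyclic graphs via Robinson's recurrence (relevant when the structures are Bayesian networks; this example is an illustration, not part of the slides).

```python
from functools import lru_cache
from math import comb

@lru_cache(maxsize=None)
def num_labeled_dags(n: int) -> int:
    """Robinson's recurrence for the number of labeled DAGs on n nodes."""
    if n == 0:
        return 1
    # Inclusion-exclusion over the k nodes chosen to have no incoming edges.
    return sum(
        (-1) ** (k + 1) * comb(n, k) * 2 ** (k * (n - k)) * num_labeled_dags(n - k)
        for k in range(1, n + 1)
    )

# The count explodes: 1, 3, 25, 543, 29281 for n = 1..5,
# which is why heuristic search over structures is necessary.
for n in range(1, 6):
    print(n, num_labeled_dags(n))
```

Even at n = 10 the count exceeds 10^18, so exhaustive scoring of every structure is infeasible and heuristics such as greedy hill-climbing over edge additions and removals are used instead.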
