Intelligent Data Analysis and Probabilistic Inference
CO 493

Spring 2014
Lecture 13
Sampling and resampling
Why Sampling?
Problem: what is the chance of winning at patience? An analytical solution is too difficult for mortals, and enumerating all possibilities is impractical.
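The alternative is to estimate the probability by simulation: play many random games and count successes. Patience itself is tedious to code, so here is a minimal sketch of the same idea on a simpler card question (the function name, trial count, and seed are illustrative choices, not from the slides):

```python
import random

def estimate_ace_on_top(trials=100_000, seed=0):
    """Estimate P(top card of a shuffled deck is an ace) by sampling."""
    rng = random.Random(seed)
    deck = list(range(52))          # cards 0..51; rank = card % 13
    hits = 0
    for _ in range(trials):
        rng.shuffle(deck)
        if deck[0] % 13 == 0:       # rank 0 stands in for "ace"
            hits += 1
    return hits / trials

estimate = estimate_ace_on_top()
exact = 4 / 52                      # = 1/13, for comparison
print(estimate, exact)
```

The estimate converges on the exact answer as the number of trials grows, which is the point of sampling: we trade an intractable enumeration for a controllable approximation error.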
Spring 2015
Lecture 4: Multiple Parents
Up until now our networks have been trees, but in general they need not be. In particular, we need to cope with the possibility of multiple parents. Multiple parents can be thought of as representing different possible causes of a common effect.
Spring 2015
Lecture 5: Probability Propagation in Singly Connected Networks
Singly Connected Networks
So far we have looked at several cases of probability propagation in networks. We now want to integrate the methods to produce a general-purpose algorithm. We will restrict our attention to singly connected networks.
Spring 2015
Lecture 8: Model Accuracy
One way of defining the accuracy of a model is how well it represents a data set. This is the likelihood of the model Bn given a data set Ds, and is written as:

P(Ds|Bn) = Π_data P(Bn)

where P(Bn) is the joint probability of the model evaluated at each data point.
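As a concrete sketch of this calculation, assuming a hypothetical two-node network A -> B with made-up conditional probability tables, the likelihood is the product over the data points of the joint probability, usually accumulated in log space to avoid underflow:

```python
import math

# Hypothetical two-node network A -> B with made-up tables (illustrative only).
P_A = {0: 0.6, 1: 0.4}
P_B_given_A = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}

def joint(a, b):
    """P(A=a, B=b): product of the conditional probabilities given parents."""
    return P_A[a] * P_B_given_A[a][b]

def log_likelihood(data):
    """log P(Ds | Bn): sum over data points of the log joint probability."""
    return sum(math.log(joint(a, b)) for a, b in data)

data = [(0, 0), (0, 1), (1, 1), (0, 0)]
print(log_likelihood(data))
```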
Spring 2015
Lecture 7: Cause and Independence
Inferring Cause from Data
When determining the causal directions in a spanning tree we assumed that we knew the root nodes. This
information was provided from our knowledge of the problem, in particular the variables invo
Spring 2015
Lecture 10: Exact Inference
Cutset Conditioning
Pearl suggested a technique called Cutset Conditioning to deal with the problem of propagation in multiply connected networks. The idea of this is to find a minimal set of nodes whose instantiation will make the rest of the network singly connected.
Spring 2015
Lecture 9: Approximate Inference
Pearl's probability propagation algorithm is fast and intuitively appealing, since it tells us about the causal structure upon which the inference was made. However, it is limited to singly connected networks. Although exact methods exist for some multiply connected networks, in general we must resort to approximation.
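One of the simplest approximate schemes is rejection (logic) sampling: sample the network forward from the priors and discard samples that disagree with the evidence. A minimal sketch on a hypothetical two-node network with made-up numbers, checked against the exact answer from Bayes theorem:

```python
import random

# Hypothetical network A -> B; all numbers are invented for illustration.
P_A1 = 0.3                          # P(A = 1)
P_B1_given_A = {0: 0.1, 1: 0.9}     # P(B = 1 | A)

def rejection_sample_posterior(trials=200_000, seed=1):
    """Estimate P(A=1 | B=1) by forward sampling, rejecting samples with B=0."""
    rng = random.Random(seed)
    kept = hits = 0
    for _ in range(trials):
        a = 1 if rng.random() < P_A1 else 0
        b = 1 if rng.random() < P_B1_given_A[a] else 0
        if b == 1:                  # keep only samples consistent with evidence
            kept += 1
            hits += a
    return hits / kept

# Exact posterior from Bayes theorem, for comparison:
exact = (0.9 * 0.3) / (0.9 * 0.3 + 0.1 * 0.7)
print(rejection_sample_posterior(), exact)
```

Rejection sampling is wasteful when the evidence is improbable, since most samples are discarded, which is one motivation for the more refined approximate methods.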
Spring 2015
Lecture 18: Support Vector Machines
The classifiers that we looked at in the earlier part of the course assumed that the class boundaries were determined by a multidimensional Gaussian function. This is justified by the common occurrence of the Gaussian distribution in natural data.
Spring 2015
Lecture 15: Principal Component Analysis
Principal Component Analysis, or simply PCA, is a statistical procedure concerned with elucidating the covariance structure of a set of variables. In particular it allows us to identify the principal directions in which the data varies.
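A minimal sketch of the procedure, using NumPy and synthetic correlated data (all numbers are illustrative): centre the data, form the covariance matrix, and take its eigenvectors as the principal directions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Correlated 2-D data: the second coordinate mostly follows the first.
x = rng.normal(size=500)
data = np.column_stack([x, 2.0 * x + rng.normal(scale=0.5, size=500)])

# PCA: eigendecomposition of the covariance matrix of the mean-centred data.
centred = data - data.mean(axis=0)
cov = np.cov(centred, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigh returns ascending eigenvalues

# Order the principal directions by decreasing variance.
order = np.argsort(eigvals)[::-1]
components = eigvecs[:, order]
projected = centred @ components            # data expressed in the principal axes
print(eigvals[order])                       # variance along each direction
```

The first column of `components` is the direction of greatest variance; projecting onto it gives the best one-dimensional summary of the data in the least-squares sense.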
Spring 2015
Lecture 12: Graphical Models for Inference
So far we have seen two graphical models that are used for inference: the Bayesian network and the join tree. These two both represent the same joint probability distribution, but in different ways. Both express the dependency structure of the variables.
Spring 2015
Lecture 11: Probability Propagation in Join Trees
In the last lecture we looked at how the variables of a Bayesian network could be grouped together in cliques of dependency, so that probability propagation could be carried out in a tree of cliques (or join tree).
Spring 2015
Lecture 14: Data Modelling
So far we have almost exclusively been considering discrete data. This has the advantage of generality: we can represent any relationship between two variables, however complex, using a discrete link matrix. However, the representation can be costly when the underlying data is continuous.
Spring 2015
Lecture 16: Linear Discriminant Analysis
In the last lecture we viewed PCA as the process of finding a projection of the covariance matrix. This projection is a transformation of data points from one axis system to another, and is an identical process to axis rotation.
Spring 2015
Lecture 17: Small Sample Size Problems and Covariance Estimation
The Parametric Bayes plug-in classifier
The most commonly used method in statistical pattern recognition is the Bayes plug-in classifier. The usual kernel that is plugged in is the multivariate Gaussian.
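A minimal sketch of a plug-in classifier, assuming two invented Gaussian classes (the data, class names, and priors are made up for illustration): estimate each class mean and covariance from samples, then classify by the largest posterior score:

```python
import numpy as np

def fit_gaussian(samples):
    """Plug-in estimates: sample mean and sample covariance for one class."""
    return samples.mean(axis=0), np.cov(samples, rowvar=False)

def log_gaussian(x, mu, cov):
    """Log density of a multivariate Gaussian (the plugged-in kernel)."""
    d = len(mu)
    diff = x - mu
    inv = np.linalg.inv(cov)
    logdet = np.linalg.slogdet(cov)[1]
    return -0.5 * (d * np.log(2 * np.pi) + logdet + diff @ inv @ diff)

def classify(x, class_params, priors):
    """Bayes plug-in rule: pick the class maximising log prior + log density."""
    scores = {c: np.log(priors[c]) + log_gaussian(x, *p)
              for c, p in class_params.items()}
    return max(scores, key=scores.get)

rng = np.random.default_rng(0)
a = rng.normal(loc=[0, 0], scale=1.0, size=(200, 2))
b = rng.normal(loc=[4, 4], scale=1.0, size=(200, 2))
params = {"a": fit_gaussian(a), "b": fit_gaussian(b)}
print(classify(np.array([3.5, 4.2]), params, {"a": 0.5, "b": 0.5}))
```

The small-sample-size problem of this lecture arises exactly here: with few samples per class, the estimated covariance matrix can be singular and `np.linalg.inv` fails, which motivates covariance estimation techniques.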
Spring 2015
Lecture 6: Building networks from data
Expert knowledge
Having established a way to make inferences with Bayesian networks, we now turn to the question of how to obtain the correct network structure for a given application. In previous lectures we assumed that the structure was supplied by an expert.
Spring 2015
Lecture 3: Evidence and Message Passing
In the last lecture we looked at calculating probabilities in a simple decision tree. In this lecture we will generalise the process into a message passing algorithm. For this purpose, it is useful to introduce some new notation.
Spring 2015
Lecture 1: Bayes theorem and Bayesian inference
Nowadays it is common to group probability and statistics together. However, the two subjects developed at very different times. Statistics emerged as an important mathematical discipline in the nineteenth century.
Spring 2014
Lecture 12:
Introduction to Probabilistic Graphical Models
Graphical Models
So far we have seen examples of two different types of graphical model representing the same inference problem.
Spring 2014
Lecture 11:
Propagating Probabilities in a Join Tree
The story so far
Data sets with high inter-variable dependency produce Bayesian networks with lots of loops. This is bad news for probability propagation.
Spring 2014
Lecture 10:
Exact Inference
Problems: highly dependent data
[Figure: a table of data samples, e.g. (a0,b3,c3,d1), (a1,b1,c2,d3), (a1,b2,c1,d2), etc., drawn from a data distribution over a set of variables; the task is to find the most dependent arcs that form a tree.]
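The task in the figure, finding the most dependent arcs that form a tree, can be sketched as a maximum spanning tree over pairwise mutual information (a Chow-Liu style construction; the variable names and toy data below are invented for illustration):

```python
from collections import Counter
from itertools import combinations
from math import log

def mutual_information(xs, ys):
    """Empirical mutual information between two discrete variables."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def dependency_tree(data):
    """Maximum spanning tree over pairwise mutual information (Kruskal)."""
    names = list(data)
    edges = sorted(((mutual_information(data[u], data[v]), u, v)
                    for u, v in combinations(names, 2)), reverse=True)
    parent = {v: v for v in names}
    def find(v):
        while parent[v] != v:
            v = parent[v]
        return v
    tree = []
    for w, u, v in edges:               # add the strongest non-cycling arcs
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            tree.append((u, v, w))
    return tree

# Toy data: B copies A exactly, C is unrelated noise.
data = {"A": [0, 1, 0, 1, 1, 0, 1, 0],
        "B": [0, 1, 0, 1, 1, 0, 1, 0],
        "C": [0, 0, 1, 1, 0, 1, 0, 1]}
print(dependency_tree(data))
```

The strongest arc (A, B) is selected first; the remaining variable is attached by the best arc that does not create a loop, which is exactly the constraint the slide describes.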
Spring 2014
Lecture 08
Model Accuracy
Joint Probability
We noted previously that the joint probability of any Bayesian network is the product of the conditional probabilities given the parents.
Spring 2014
Lecture 07
Cause and Independence
Joint Probability Distributions
A data set has a joint probability distribution:

P(ai, bj, ck) = No([ai, bj, ck]) / N

and so does a network, where the joint probability is the product of the conditional probabilities given the parents.
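A minimal sketch of the counting definition, with invented records (No([...]) is the number of occurrences of a joint state and N the size of the data set):

```python
from collections import Counter

def empirical_joint(records):
    """P(a, b, c) = No([a, b, c]) / N: relative frequency of each joint state."""
    n = len(records)
    return {state: c / n for state, c in Counter(records).items()}

data = [("a1", "b1", "c1"), ("a1", "b2", "c1"),
        ("a1", "b1", "c1"), ("a2", "b2", "c2")]
print(empirical_joint(data))
```

Comparing this empirical distribution with the one implied by a network is the basis of the model-accuracy measures discussed in Lecture 08.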
Spring 2014
Lecture 6
Where do Bayesian Nets come from?
Building Networks from Data
Expert knowledge
In previous lectures we used the following methodology:
1. Consult an expert to obtain the structure of the network.
Spring 2014
Lecture 5
Probability Propagation in Singly Connected Networks
Probability Propagation
We now generalise the work of the previous lectures into a set of five operating equations which can be applied to any singly connected network.
Spring 2014
Lecture 4
Multiple Parents
Probability Propagation in Trees
In the last lecture we saw that probability calculations in trees can be done locally by multiplying the evidence arriving from a node's neighbours.
Spring 2014
Lecture 3
Evidence and Message Passing
The story so far:
Naive Bayesian networks express Bayes theorem for conditionally independent variables:

P(C|S&D&F) = P(C) P(S|C) P(D|C) P(F|C)
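A minimal sketch of this product, with made-up tables for a class C and three binary findings S, D and F (all numbers are invented); the result is normalised so the posteriors sum to one:

```python
# Hypothetical naive Bayes tables: P(C) and P(finding = 1 | C).
P_C = {0: 0.7, 1: 0.3}
P_S = {0: 0.2, 1: 0.6}
P_D = {0: 0.1, 1: 0.5}
P_F = {0: 0.3, 1: 0.8}

def posterior_c(s, d, f):
    """P(C | S, D, F) via the naive Bayes product, then normalisation."""
    def lik(c, table, v):
        return table[c] if v == 1 else 1 - table[c]
    unnorm = {c: P_C[c] * lik(c, P_S, s) * lik(c, P_D, d) * lik(c, P_F, f)
              for c in (0, 1)}
    z = sum(unnorm.values())
    return {c: u / z for c, u in unnorm.items()}

print(posterior_c(1, 1, 1))
```

The conditional independence assumption is what lets the joint factor into one small table per finding, instead of one large table over all findings at once.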
Spring 2014
Lecture 2
Simple Bayesian Networks
Limitations of Simple Bayesian Inference
Simple Bayesian inference is inadequate to deal with more complex models of prior knowledge. Once several variables interact, a richer network structure is needed.
Spring 2014
Data Analysis Tools: Assignment 1
Salim Baz, CID 00618558
2016-09-09
Chapter 2. Inequalities
1.1 Exercise 1
The Taylor series expansion of the concave function u(x) is given by the following equation:

u(x) = u(a) + (x - a) u'(a) + ((x - a)^2 / 2) u''(a) + ...   (1)

Using equation (1) and the concavity of u, so that u''(a) <= 0, we obtain u(x) <= u(a) + (x - a) u'(a).
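As a numeric sanity check of the expansion and the concavity bound, assuming the concrete concave function u(x) = log x (the choice of function and evaluation points is illustrative):

```python
import math

def taylor2(u, du, d2u, a, x):
    """Second-order Taylor expansion of u about a, as in equation (1)."""
    return u(a) + (x - a) * du(a) + 0.5 * (x - a) ** 2 * d2u(a)

# Concave example u(x) = log x: u''(x) = -1/x^2 < 0.
u, du, d2u = math.log, lambda x: 1 / x, lambda x: -1 / x ** 2
a = 1.0
for x in (0.5, 1.5, 2.0):
    tangent = u(a) + (x - a) * du(a)    # first-order expansion at a
    assert u(x) <= tangent              # concavity: u(x) <= u(a) + (x-a)u'(a)
    print(x, u(x), tangent, taylor2(u, du, d2u, a, x))
```

The second-order term is always non-positive for a concave function, which is exactly why truncating the expansion after the linear term gives an upper bound.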
Spring 2015
Lecture 2: Simple Bayesian Networks
Simple Bayesian inference is inadequate to deal with more complex models of prior knowledge. Consider our measure:

Catness = |(Rl - Rr)/Rr| + |Si - 2(Rl + Rr)/Rr|

We are currently weighting the two terms equally, but we could choose weights that reflect the reliability of each term.