Probabilistic Modelling and Reasoning: Assignment 2
School of Informatics, University of Edinburgh
Instructor: Dr Chris Williams
Handed out 22 November 2005. Due in 9 December 2005 by 4pm
Remember that plagiarism is a university offence. Please read the pol

Bayesian Reasoning and Machine Learning
David Barber © 2007, 2008, 2009, 2010, 2011, 2012, 2013
Notation List
V — a calligraphic symbol typically denotes a set of random variables . . . . 7
dom(x) — Domain of a variable . . . . . . . . . . . . . . . . . . .

Probabilistic Modelling and Reasoning
Tutorial Sheet 6
School of Informatics, University of Edinburgh
Instructor: Amos Storkey
Based on Questions by Chris Williams
March 7, 2014
1. A Hidden Markov Model problem.
Consider a HMM with 3 states (M = 3) and 2

Probabilistic Modelling and Reasoning, Tutorial Question Sheet 5
School of Informatics, University of Edinburgh
Instructor: Amos Storkey
February 2014
A Boltzmann Machine takes the form
P(x | W, a, b) = (1/Z) exp( (1/2) x^T W x + b^T x )
1. Show that the probabi
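The unnormalised Boltzmann probability and its partition function Z can be checked numerically for very small models. A minimal sketch, assuming binary units x in {0,1}^n and a symmetric weight matrix W with zero diagonal; brute-force enumeration of Z is only feasible for small n:

```python
import itertools
import numpy as np

def boltzmann_unnorm(x, W, b):
    """Unnormalised Boltzmann machine probability exp(x^T W x / 2 + b^T x)."""
    return np.exp(0.5 * x @ W @ x + b @ x)

def partition_function(W, b):
    """Brute-force Z by summing over all 2^n binary states."""
    n = len(b)
    return sum(boltzmann_unnorm(np.array(s), W, b)
               for s in itertools.product([0, 1], repeat=n))

# Small example: 3 binary units, symmetric weights, zero diagonal.
rng = np.random.default_rng(0)
A = rng.normal(size=(3, 3))
W = (A + A.T) / 2
np.fill_diagonal(W, 0.0)
b = rng.normal(size=3)

Z = partition_function(W, b)
probs = [boltzmann_unnorm(np.array(s), W, b) / Z
         for s in itertools.product([0, 1], repeat=3)]
print(sum(probs))  # normalised probabilities sum to 1
```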

Probabilistic Modelling and Reasoning
Tutorial Sheet 7
School of Informatics, University of Edinburgh
Instructor: Amos Storkey
Some questions by Chris Williams
February 2014
1. Use the Kalman filtering equations given in Barber's book (see Algorithm 24.1) to
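As a companion to the question, here is one predict/update step of the Kalman filter in standard linear-Gaussian notation. The variable names are assumptions for illustration and need not match those of Barber's Algorithm 24.1:

```python
import numpy as np

def kalman_step(mu, V, y, A, C, Q, R):
    """One predict/update step of the Kalman filter for the model
    h_t = A h_{t-1} + noise(Q),  v_t = C h_t + noise(R)."""
    # Predict step: push the posterior through the dynamics.
    mu_pred = A @ mu
    V_pred = A @ V @ A.T + Q
    # Update step: correct with the new observation y.
    S = C @ V_pred @ C.T + R               # innovation covariance
    K = V_pred @ C.T @ np.linalg.inv(S)    # Kalman gain
    mu_new = mu_pred + K @ (y - C @ mu_pred)
    V_new = (np.eye(len(mu)) - K @ C) @ V_pred
    return mu_new, V_new

# 1-D random-walk example: scalar state observed with noise.
A = np.array([[1.0]]); C = np.array([[1.0]])
Q = np.array([[0.1]]); R = np.array([[1.0]])
mu, V = np.array([0.0]), np.array([[1.0]])
for y in [0.5, 0.7, 0.6]:
    mu, V = kalman_step(mu, V, np.array([y]), A, C, Q, R)
```

After a few observations the posterior variance is well below the prior variance, as expected for a well-observed state.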

Probabilistic Modelling and Reasoning
Assignment
Instructor: Prof Amos Storkey
Published: Monday, February 10th, 2014
Due: Monday, March 3rd, 2014 by 4pm
Remember that plagiarism is a university offence. Please read the policy at
http://www.inf.ed.ac.uk/teac

Short course on adaptive modelling: Lab session
prepared for 2012/13 class by: Dino Sejdinovic
previous versions by: John Shawe-Taylor, Tom Diethe
May 14, 2013
Abstract
Using a series of examples, in this exercise session you will familiarise
yourselves w

Hidden Markov Models
Chris Williams
School of Informatics, University of Edinburgh
Overview
Definitions
Inference Problems
Recursion formulae
Viterbi alignment
Training a HMM
Reading: Bishop 13.1, 13.2 (but not 13.2.3, 13.2.4, 13.2.5), Rabiner paper
November

Bayesian Model Selection
Chris Williams
School of Informatics, University of Edinburgh
Overview
Bayesian Learning of CPTs
Dealing with Multiple Models
Other Scores for Model Comparison
Searching over Belief Network structures
Readings: Bishop 3.4, Heckerm

PMR Comments
Probabilistic Modelling and Reasoning
Amos Storkey
School of Informatics, University of Edinburgh
General comments and questions
Use this document for comments about items that do not
relate directly to a particu

Probabilistic Modelling and Reasoning, Tutorial Question Sheet 1
(for Week 3)
School of Informatics, University of Edinburgh
Instructor: Amos Storkey
Tutorial sheet based on questions by Chris Williams
February 2014
1. [Maximum likelihood estimation]. The

Probabilistic Modelling and Reasoning
Tutorial Sheet 3
School of Informatics, University of Edinburgh
Instructor: Amos Storkey
Questions by Chris Williams
February 2014
1. [Bishop qu 2.7]. Consider the Bernoulli random variable X with mean μ, and suppose
we

Probabilistic Modelling and Reasoning: Assignment 2
School of Informatics, University of Edinburgh
Instructor: Prof Chris Williams
Published 17 November 2008. Due in 15 December 2008 by 4pm
Remember that plagiarism is a university offence. Please read the p

Probabilistic Modelling and Reasoning: Assignment 2
Instructor: Prof Chris Williams
Published 13 November 2009. Due in Mon 14 December 2009 by 4pm
Remember that plagiarism is a university offence. Please read the policy at
http://www.inf.ed.ac.uk/teaching/pl

Probabilistic Modelling and Reasoning
Assignment 2
Instructor: Dr. Chris Williams
Published: Tues November 23, 2010
Due: Monday, December 13, 2010 by 4pm
Remember that plagiarism is a university offence. Please read the policy at
http://www.inf.ed.ac.uk/teac

Answers to PMR tutorial questions (Sheet 3)
Kian Ming Adam Chai and Chris Williams
February 4, 2014
1.
(Binomial likelihood)
P(n_h, n_t | θ) ∝ θ^{n_h} (1 − θ)^{n_t}
(Maximum likelihood solution, see last week's tutorial)
θ_ML = n_h / (n_h + n_t)
(Beta prior)
p(θ) ∝ θ^{α_h − 1} (1 − θ)^{α_t − 1}
(P
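Both the maximum-likelihood solution and the mode of the Beta posterior reduce to simple count ratios, which is easy to check numerically. A minimal sketch (the counts here are made-up examples, not the tutorial's numbers):

```python
def theta_ml(nh, nt):
    """Maximum-likelihood estimate for a Bernoulli parameter from head/tail counts."""
    return nh / (nh + nt)

def theta_map(nh, nt, alpha_h, alpha_t):
    """MAP estimate under a Beta(alpha_h, alpha_t) prior: the mode of the
    Beta(nh + alpha_h, nt + alpha_t) posterior."""
    return (nh + alpha_h - 1) / (nh + nt + alpha_h + alpha_t - 2)

print(theta_ml(7, 3))         # 0.7
print(theta_map(7, 3, 2, 2))  # (7 + 1) / (10 + 2) = 2/3
```

With a uniform Beta(1, 1) prior the MAP estimate coincides with the ML estimate, as the formulas above imply.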

Probabilistic Modelling and Reasoning, Tutorial Question Sheet 5
School of Informatics, University of Edinburgh
Instructor: Amos Storkey
February 2014
This sheet is too long for one tutorial. Also, with the assignment due, (almost)
no one will have done it.

Answers to PMR tutorial questions (Number 6)
March 7, 2014
Q11(i).
[Figure: chain-structured HMM with hidden states z1, z2, z3 and observations x1, x2, x3]
P(X) = Σ_{z_n ∈ {1,2,3}} α(z_n) β(z_n) = Σ_{z_3 ∈ {1,2,3}} α(z_3) β(z_3)    (choose n = 3)
We have that β(z3) = 1 for all values of z3; we just need to calculate α(z3) using the recursion
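The α recursion can be run numerically. A sketch with hypothetical transition and emission tables (the sheet's actual numbers are not reproduced here), exploiting β(z3) = 1 so that P(X) = Σ_{z3} α(z3):

```python
import numpy as np

# Hypothetical 3-state HMM parameters for illustration only.
pi = np.array([0.5, 0.3, 0.2])            # initial distribution p(z1)
A = np.array([[0.6, 0.3, 0.1],            # A[i, j] = p(z_{t+1}=j | z_t=i)
              [0.2, 0.5, 0.3],
              [0.1, 0.2, 0.7]])
B = np.array([[0.9, 0.1],                 # B[i, k] = p(x_t=k | z_t=i)
              [0.5, 0.5],
              [0.2, 0.8]])
obs = [0, 1, 0]                           # observed symbols x1, x2, x3

# Alpha recursion: alpha(z1) = p(x1|z1) pi(z1);
# alpha(z_t) = p(x_t|z_t) * sum_{z_{t-1}} alpha(z_{t-1}) A[z_{t-1}, z_t].
alpha = pi * B[:, obs[0]]
for x in obs[1:]:
    alpha = B[:, x] * (alpha @ A)

# With beta(z3) = 1 for all z3, P(X) = sum_{z3} alpha(z3).
print(alpha.sum())
```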

Probabilistic Modelling and Reasoning, Tutorial
Question Sheet 2 (for Week 4)
School of Informatics, University of Edinburgh
Instructor: Amos Storkey
October 2013
1. Consider the belief network given below, which concerns the probability of a car
starting

Probabilistic Modelling and Reasoning, Tutorial Question Sheet 1
(for Week 3)
School of Informatics, University of Edinburgh
Instructor: Amos Storkey
Tutorial sheet by Chris Williams
September 2013
1. [From Tipping, 2.1.3]. Box 1 contains 8 apples and 4 o

Probabilistic Modelling and Reasoning: Assignment 2
School of Informatics, University of Edinburgh
Instructor: Dr. Chris Williams
Published 13 November 2007. Due in 30 November 2007 by 4pm
Remember that plagiarism is a university offence. Please read the po

Questions
Example
In a large online gaming site, how do we match
players with similar skill levels?
Slightly harder. Who will beat whom in a basketball
league?
See www.kaggle.com for the above challenge
Good answers to both of these use belief
networks

The Matrix Cookbook
[ http://matrixcookbook.com ]
Kaare Brandt Petersen
Michael Syskind Pedersen
Version: November 15, 2012
1
Introduction
What is this? These pages are a collection of facts (identities, approximations, inequalities, relations, ...) about ma

Probabilistic Modelling and Reasoning
Reference Solution for Assignment 2014
Instructor: Prof. Amos Storkey
March 12, 2014
1. Inference in a Belief Network
(a)
Joint distribution (8 marks)
P(D, C, S, J, I, V, T) = P(D) P(C|D) P(S) P(J|D) P(I|C) P(
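Evaluating such a factorised joint is just a product of CPT lookups. A sketch using the first few factors, with invented illustrative numbers (not the assignment's actual tables):

```python
# Hypothetical CPTs for the first factors of the factorisation.
P_D = {True: 0.3, False: 0.7}
P_C_given_D = {(True, True): 0.8, (False, True): 0.2,    # keys: (C, D)
               (True, False): 0.1, (False, False): 0.9}
P_S = {True: 0.4, False: 0.6}

def joint(d, c, s):
    """Product of CPT entries, following the factorisation P(D) P(C|D) P(S) ..."""
    return P_D[d] * P_C_given_D[(c, d)] * P_S[s]

# Sanity check: the joint sums to 1 over all assignments.
total = sum(joint(d, c, s)
            for d in (True, False)
            for c in (True, False)
            for s in (True, False))
print(total)  # 1.0
```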

Answers to PMR tutorial questions (Sheet 3)
Kian Ming Adam Chai and Chris Williams
February 14, 2014
This sheet is relatively short. There may be some residual questions from last week's tutorial. There may be further questions
on the course material or on

%Naive Bayes classifier - text example
%last update: February 2013
%Xtrn - training data
%ytrn - labels for training data
%Xtst - testing data
%ytst - correct labels for testing data
function [ypred] = naive_bayes(Xtrn, ytrn, Xtst)
%the vector of unique c
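Since the MATLAB listing is truncated, the same interface can be sketched in Python. This assumes binary (word-presence) features and Laplace smoothing, which may differ from the lab's actual implementation:

```python
import numpy as np

def naive_bayes(Xtrn, ytrn, Xtst, alpha=1.0):
    """Naive Bayes for binary features with Laplace smoothing alpha.
    A sketch of the (truncated) MATLAB function's interface."""
    classes = np.unique(ytrn)
    log_prior = np.zeros(len(classes))
    log_p1 = np.zeros((len(classes), Xtrn.shape[1]))  # log p(x_j = 1 | class)
    log_p0 = np.zeros_like(log_p1)
    for i, c in enumerate(classes):
        Xc = Xtrn[ytrn == c]
        log_prior[i] = np.log(len(Xc) / len(Xtrn))
        p1 = (Xc.sum(axis=0) + alpha) / (len(Xc) + 2 * alpha)
        log_p1[i], log_p0[i] = np.log(p1), np.log(1 - p1)
    # Log joint for each test point and class; predict the argmax class.
    scores = log_prior + Xtst @ log_p1.T + (1 - Xtst) @ log_p0.T
    return classes[np.argmax(scores, axis=1)]

# Tiny sanity check: two well-separated classes.
Xtrn = np.array([[1, 1, 0], [1, 0, 0], [0, 0, 1], [0, 1, 1]])
ytrn = np.array([0, 0, 1, 1])
ypred = naive_bayes(Xtrn, ytrn, Xtrn)
print(ypred)  # [0 0 1 1]
```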

%Classification examples with mixtures of gaussians
%DS (February 2013)
% generate data
clear all
close all
%means of the positive examples
mux1=[-1; -1];
mux2=[1; 1];
%means of the negative examples
muy1=[-1; 1];
muy2=[1; -1];
%all components in the mix

%Naive Bayes classifier - gaussian fit along each dimension
%last update: February 2013
%Xtrn - training data
%ytrn - labels for training data
%Xtst - testing data
%ytst - correct labels for testing data
function [ypred] = naive_bayes_gaussianfit(Xtrn, yt
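This MATLAB listing is also truncated; a Python sketch of the same idea, fitting an independent univariate Gaussian per class and per dimension (an assumption matching the header comment, not necessarily the lab's exact code):

```python
import numpy as np

def naive_bayes_gaussianfit(Xtrn, ytrn, Xtst):
    """Naive Bayes with a univariate Gaussian fitted along each dimension
    for each class. A sketch of the (truncated) MATLAB function's interface."""
    classes = np.unique(ytrn)
    scores = []
    for c in classes:
        Xc = Xtrn[ytrn == c]
        mu, var = Xc.mean(axis=0), Xc.var(axis=0) + 1e-9  # jitter avoids /0
        # Sum of per-dimension Gaussian log-densities plus the log class prior.
        log_lik = -0.5 * (np.log(2 * np.pi * var) + (Xtst - mu) ** 2 / var)
        scores.append(np.log(len(Xc) / len(Xtrn)) + log_lik.sum(axis=1))
    return classes[np.argmax(np.stack(scores, axis=1), axis=1)]

# Tiny sanity check: one cluster near the origin, one near (5, 5).
Xtrn = np.array([[0.0, 0.1], [0.2, 0.0], [5.0, 5.1], [5.2, 4.9]])
ytrn = np.array([0, 0, 1, 1])
ypred = naive_bayes_gaussianfit(Xtrn, ytrn, np.array([[0.1, 0.1], [5.0, 5.0]]))
print(ypred)  # [0 1]
```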

%Classification examples with mixtures of gaussians
%DS (February 2013)
% generate data
clear all
close all
%means of the positive examples
mux1=[1; -1];
%means of the negative examples
muy1=[-1; 1];
%covariance defined below via eigendecomposition C=U*in
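The truncated comment suggests the covariance is constructed from an eigendecomposition C = U * Lambda * U'. A Python sketch with made-up rotation angle and eigenvalues (the lab's actual values are not shown):

```python
import numpy as np

# Build a 2x2 covariance from orthonormal eigenvectors U and eigenvalues lam:
# C = U diag(lam) U^T. The angle and eigenvalues here are illustrative only.
theta = np.pi / 4
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # rotation = orthonormal basis
lam = np.array([2.0, 0.2])                        # positive eigenvalues
C = U @ np.diag(lam) @ U.T

# Sample points from a Gaussian with this covariance.
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(mean=[1.0, -1.0], cov=C, size=500)
```

By construction C is symmetric positive definite, so it is a valid Gaussian covariance.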