Intelligent Data Analysis and Probabilistic Inference
CO 493

Spring 2014
Data Analysis Tools: Assignment 1
Salim Baz, CID 00618558
2016-09-09
1 Chapter 2. Inequalities
1.1 Exercise 1
The Taylor series expansion of the concave function u(x) about a point a is given by the following equation:

u(x) = u(a) + (x - a) u'(a) + \frac{(x - a)^2}{2} u''(a) + \cdots    (1)

Using equation (1), set o
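Since u is concave its second derivative is non-positive, so truncating the expansion at first order gives an upper bound. A short derivation, assuming u is twice differentiable:

```latex
% Taylor's theorem with Lagrange remainder about the point a:
% for some \xi between a and x,
%   u(x) = u(a) + (x-a)u'(a) + \frac{(x-a)^2}{2}u''(\xi).
% Concavity gives u''(\xi) \le 0, and (x-a)^2 \ge 0, so the
% quadratic term is non-positive and the tangent line at a
% lies above the function:
\begin{align*}
u(x) &= u(a) + (x-a)\,u'(a) + \frac{(x-a)^2}{2}\,u''(\xi)\\
     &\le u(a) + (x-a)\,u'(a).
\end{align*}
```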
Intelligent Data Analysis and Probabilistic Inference
CO 493

Spring 2015
Lecture 13: Sampling and resampling
Sampling provides an effective way of solving problems for which a closed-form solution is not feasible,
and numerical solutions are too computationally demanding. A simple example is the problem "What is the
probability
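Not from the notes, but as a concrete sketch of the idea: a short Python Monte Carlo estimate of a probability whose exact value is known (two fair dice summing to at least 10, exactly 6/36), so the sampling error can be seen directly. The function name and sample count are invented for illustration:

```python
import random

def estimate_p_sum_ge_10(n_samples: int, seed: int = 0) -> float:
    """Estimate P(d1 + d2 >= 10) for two fair dice by sampling."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        d1 = rng.randint(1, 6)  # inclusive bounds: a fair six-sided die
        d2 = rng.randint(1, 6)
        if d1 + d2 >= 10:
            hits += 1
    return hits / n_samples

# The exact answer is 6/36, about 0.167; the estimate converges as n grows.
estimate = estimate_p_sum_ge_10(100_000)
```

The same recipe applies when no exact answer exists: draw samples from the model and count how often the event of interest occurs.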
Lecture 17: Small Sample Size Problems and Covariance Estimation
The Parametric Bayes plug-in classifier
The most commonly used method in statistical pattern recognition is the Bayes plug-in classifier. The usual
kernel that is plugged in is the multivariate
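A minimal one-dimensional sketch of the plug-in idea, with invented data and function names: estimate each class's mean and variance from training samples, then classify a point by plugging those estimates into the Gaussian density:

```python
import math

def fit_gaussian(xs):
    """Plug-in estimates: sample mean and (biased) sample variance."""
    n = len(xs)
    mu = sum(xs) / n
    var = sum((x - mu) ** 2 for x in xs) / n
    return mu, var

def gaussian_pdf(x, mu, var):
    """Univariate normal density with the estimated parameters plugged in."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def classify(x, params_by_class, priors):
    """Pick the class maximising prior * estimated class-conditional density."""
    return max(params_by_class,
               key=lambda c: priors[c] * gaussian_pdf(x, *params_by_class[c]))

# Two well-separated classes with equal priors (data invented).
params = {"a": fit_gaussian([0.9, 1.0, 1.1]), "b": fit_gaussian([4.9, 5.0, 5.1])}
priors = {"a": 0.5, "b": 0.5}
label = classify(1.2, params, priors)  # falls in class "a"
```

The multivariate case replaces the scalar mean and variance with a mean vector and covariance matrix, which is where the small-sample estimation problems discussed in this lecture arise.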
Lecture 16: Linear Discriminant Analysis
In the last lecture we viewed PCA as the process of finding a projection of the covariance matrix. This projection is a transformation of data points from one axis system to another, and is an identical process to ax
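That change of axis system is just multiplication by an orthonormal (rotation) matrix; a tiny pure-Python sketch with made-up coordinates:

```python
import math

def rotate(point, theta):
    """Express a 2-D point in an axis system rotated by theta radians."""
    x, y = point
    c, s = math.cos(theta), math.sin(theta)
    # The rows of the rotation matrix are the new (orthonormal) basis vectors.
    return (c * x + s * y, -s * x + c * y)

# Expressing (1, 1) in axes rotated 45 degrees puts all of its length on
# the first new axis: approximately (sqrt(2), 0).
new_coords = rotate((1.0, 1.0), math.pi / 4)
```

PCA and LDA both amount to choosing the rotation so that the new axes line up with directions that are useful for the task (maximum variance for PCA, maximum class separation for LDA).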
Lecture 14: Data Modelling
So far we have almost exclusively been considering discrete data. This has the advantage of generality. We can
represent any relationship between two variables, however complex, using a discrete link matrix. However,
the repre
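As an illustration of the link-matrix representation (all numbers invented): a conditional probability table P(B|A) stored with one column per parent state, marginalised against a prior on A:

```python
# Link matrix: rows indexed by states of B, columns by states of A,
# so column j holds the distribution P(B | A = a_j) and sums to 1.
link = [
    [0.9, 0.2],   # P(B = b1 | A = a1), P(B = b1 | A = a2)
    [0.1, 0.8],   # P(B = b2 | A = a1), P(B = b2 | A = a2)
]
p_a = [0.3, 0.7]  # prior distribution over the states of A

# Marginalisation: P(B = b_i) = sum_j P(B = b_i | A = a_j) * P(A = a_j)
p_b = [sum(link[i][j] * p_a[j] for j in range(len(p_a)))
       for i in range(len(link))]
# p_b is [0.41, 0.59] (up to rounding), which sums to 1 as any distribution must.
```

The generality comes at a cost: the table grows with the product of the state counts, which motivates the move to parametric models for continuous data in this lecture.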
Lecture 11: Probability Propagation in Join Trees
In the last lecture we looked at how the variables of a Bayesian network could be grouped together in cliques of
dependency, so that probability propagation could be carried out in a tree of cliques (or jo
Lecture 12: Graphical Models for Inference
So far we have seen two graphical models that are used for inference: the Bayesian network and the Join
tree. They both represent the same joint probability distribution, but in different ways. Both express
Lecture 15: Principal Component Analysis
Principal Component Analysis, or simply PCA, is a statistical procedure concerned with elucidating the covariance structure of a set of variables. In particular it allows us to identify the principal directions in
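For two variables the principal direction can be computed in closed form from the 2x2 sample covariance matrix; a pure-Python sketch on made-up points (the function name is ours, not from the notes):

```python
import math

def principal_direction(points):
    """Unit eigenvector of the 2x2 sample covariance with largest eigenvalue."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Largest eigenvalue of the symmetric matrix [[sxx, sxy], [sxy, syy]].
    lam = (sxx + syy) / 2 + math.sqrt(((sxx - syy) / 2) ** 2 + sxy ** 2)
    # Corresponding eigenvector; fall back to an axis when sxy is (near) zero.
    vx, vy = (lam - syy, sxy) if abs(sxy) > 1e-12 else (sxx >= syy, sxx < syy)
    norm = math.hypot(vx, vy)
    return vx / norm, vy / norm

# Points spread along the line y = x: the principal direction is (1,1)/sqrt(2).
d = principal_direction([(0, 0), (1, 1), (2, 2), (3, 3)])
```

In higher dimensions the same idea applies, but the eigenvectors are found numerically rather than in closed form.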
Lecture 18: Support Vector Machines
The classifiers that we looked at in the earlier part of the course assumed that the class boundaries were determined by a multidimensional Gaussian function. This is justified by the common occurrence of the Gaussian
dist
Lecture 9: Approximate Inference
Pearl's probability propagation algorithm is fast and intuitively appealing since it tells us about the causal structure upon which the inference was made. However, it is limited to singly connected networks. Although for
so
Lecture 10: Exact Inference
Cutset Conditioning
Pearl suggested a technique called Cutset Conditioning to deal with the problem of propagation in multiply
connected networks. The idea of this is to find a minimal set of nodes whose instantiation will make t
Lecture 7: Cause and Independence
Inferring Cause from Data
When determining the causal directions in a spanning tree we assumed that we knew the root nodes. This
information was provided from our knowledge of the problem, in particular the variables invo
Lecture 8: Model Accuracy
One way of defining the accuracy of a model is how well it represents a data set. This is the likelihood of the
model Bn given a data set Ds, and is written as:

P(Ds | Bn) = \prod_{data} P(Bn)

Where P(Bn) is the joint probability of the m
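In practice the product over data points is computed as a sum of logarithms to avoid floating-point underflow; a sketch comparing two invented single-variable models on invented data:

```python
import math

def log_likelihood(model, data):
    """log P(Ds | Bn): sum over data points of the log model probability."""
    return sum(math.log(model[point]) for point in data)

# Two made-up models assigning probabilities to two observable states.
model_1 = {"rain": 0.8, "dry": 0.2}
model_2 = {"rain": 0.5, "dry": 0.5}
data = ["rain", "rain", "dry", "rain"]

# The model with the higher (log-)likelihood represents the data better.
better = max((model_1, model_2), key=lambda m: log_likelihood(m, data))
```

For a full Bayesian network the per-point probability is the joint probability of the network evaluated at that data point, but the comparison works the same way.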
Lecture 5: Probability Propagation in Singly connected Networks
Singly Connected Networks
So far we have looked at several cases of probability propagation in networks. We now want to integrate the methods to produce a general purpose algorithm. We will r
Lecture 4: Multiple Parents
Up until now our networks have been trees, but in general they need not be. In particular, we need to cope
with the possibility of multiple parents. Multiple parents can be thought of as representing different possible
causes o
Lecture 6: Building networks from data
Expert knowledge
Having established a way to make inferences with Bayesian networks, we now turn to the question of how to
obtain the correct network structure for a given application. In previous lectures we assumed
Lecture 3: Evidence and Message Passing
In the last lecture we looked at calculating probabilities in a simple decision tree. In this lecture we will
generalise the process into a message passing algorithm. For this purpose, it is useful to introduce the
Lecture 1: Bayes' theorem and Bayesian inference
Nowadays it is common to group probability and statistics together. However the two subjects developed at
very different times. Statistics emerged as an important mathematical discipline in the nineteenth ce
Lecture 2: Simple Bayesian Networks
Simple Bayesian inference is inadequate to deal with more complex models of prior knowledge. Consider our
measure:

Catness = |(Rl - Rr)/Rr| + |(Si - 2(Rl + Rr))/Rr|

We are currently weighting the two terms equally, but
Compiler and Hardware
optimisations (for the
AMD Jaguar processor)
Fabio Luporini, Paul Kelly
07/03/2014
What is this presentation about?
We instantiate some of the concepts you've seen
with Paul in a real, state-of-the-art architecture:
the AMD Jaguar
332
Advanced Computer Architecture
Chapter 6
Instruction Level Parallelism
- Limits and alternatives
February 2014
Paul H J Kelly
These lecture notes are partly based on the course text, Hennessy and
Patterson's Computer Architecture, a quantitative appro
Chapter 7
Manycore and Graphics Processors
March 2014
Paul H J Kelly
This section by Anton Lokhmotov (ex-Imperial, now with ARM), Paul H J Kelly (Imperial), with
slides from Lee Howes (Imperial PhD, n
Chapter 9
Theoretical computer architecture
March 2014
Paul H J Kelly
Overview
- The role of theory in computer architecture
- Computing at the end of Moore's Law
- Asymptotics ve
Chapter 8
Parallel architectures, shared memory, and
cache coherency
March 2014
Paul H J Kelly
Chapter 5
Compiler issues: dependence analysis,
vectorisation, automatic parallelisation
February 2014
Paul H J Kelly
Background reading
The material for this part of the course
Chapter 3
Dynamic scheduling, out-of-order execution, register
renaming and speculative execution
February 2014
Paul H J Kelly
The Turing Tax
Discussion exercise
Alan Turing and colleagues working on the Ferranti Mark
I Computer in 1951. How intelligent was it? Photograph:
Science & Society Picture Library/Getty Images
[Figure: processor datapath fragment showing register read ports (RD), a multiplexer (MUX), sign extension, and register write]
Chapter 1
Introduction and review of
Pipelines, Performance, and Caches
January 2014
Paul H J Kelly
Chapter 4
Branch Prediction
February 2014
Paul H J Kelly
These lecture notes are partly based on the course text, Hennessy and
Patterson's Computer Architecture, a quantitative approach (4th ed), and on
the lecture slides
Chapter 2: Caches and Memory Systems
January 2014
Paul H J Kelly
These lecture notes are partly based on the course text, Hennessy and
Patterson's Computer Architecture, a quantitative approach (3rd, 4th
and 5th eds), an