Let's go back to the cards question on slide 6 of Lecture notes 5.
STEP 1: DEFINE EVENTS
Let E be the event that the first card is a club.
Let F be the event that the first card is not a club.
Let A be the event that the second card is a club.
STEP 2: CHOO
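With these events, P(A) follows from the law of total probability; a minimal sketch, assuming a standard 52-card deck with two cards drawn without replacement:

```python
from fractions import Fraction

# Standard 52-card deck, 13 clubs; two cards drawn without replacement.
p_E = Fraction(13, 52)          # P(first card is a club)
p_F = Fraction(39, 52)          # P(first card is not a club)
p_A_given_E = Fraction(12, 51)  # one club already removed
p_A_given_F = Fraction(13, 51)  # all 13 clubs still in the deck

# Law of total probability: P(A) = P(A|E)P(E) + P(A|F)P(F)
p_A = p_A_given_E * p_E + p_A_given_F * p_F
print(p_A)  # 1/4
```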

3. Conditional Probability
Conditional probability concerns the calculation of probabilities, given (conditional on) some partial information.
Example 1
(i) Throw a die; you are told that the number is even.
Let A be the event that the number is even,
B be
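The key formula is P(B | A) = P(A and B) / P(A), which can be checked by direct enumeration; a minimal sketch, where the choice of B (the number is 2) is purely illustrative:

```python
from fractions import Fraction

outcomes = set(range(1, 7))                 # fair die
A = {n for n in outcomes if n % 2 == 0}     # A: the number is even
B = {2}                                     # illustrative choice for B

def prob(event):
    return Fraction(len(event), len(outcomes))

# Conditional probability: P(B | A) = P(A and B) / P(A)
p_B_given_A = prob(A & B) / prob(A)
print(p_B_given_A)  # 1/3
```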

8 Normal Distribution
8.1 Introduction
The normal distribution is probably the most important probability distribution, playing a key role in
both probability and statistics. We will begin by describing the normal distribution and we will then go
on to dem

6 Standard Discrete Distributions
6.1 Bernoulli Distribution
This is a random variable with only two possible outcomes, 1 (success) and 0 (failure).
Probability mass function (pmf):
P(Success) = p,  P(Failure) = 1 - p.
Random variable:
X = 1 with probability p,
X = 0 with probability 1 - p.
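The pmf above can be simulated directly; a minimal sketch (the function name and the choice p = 0.3 are illustrative):

```python
import random

def bernoulli(p, rng=random):
    """Return 1 (success) with probability p, 0 (failure) otherwise."""
    return 1 if rng.random() < p else 0

random.seed(0)                      # fixed seed so the run is reproducible
p = 0.3                             # illustrative success probability
samples = [bernoulli(p) for _ in range(10_000)]
print(sum(samples) / len(samples))  # sample frequency, close to p
```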

9. Chebyshev's Inequality and the weak law of large numbers
Chebyshev's Inequality tells us something about how likely it is
that any random variable X will take a value far away from its
average value, μ. We can use it even if we don't know the full
distribution of X.
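Writing the bound as P(|X - μ| ≥ k) ≤ Var(X)/k², it can be checked exactly on a small example; a sketch using a fair die (an illustrative choice):

```python
from fractions import Fraction

# Fair die: an illustrative distribution with known mean and variance.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
mu = sum(x * p for x, p in pmf.items())               # E[X] = 7/2
var = sum((x - mu) ** 2 * p for x, p in pmf.items())  # Var(X) = 35/12

k = 2
# Exact tail probability P(|X - mu| >= k) ...
tail = sum(p for x, p in pmf.items() if abs(x - mu) >= k)
bound = var / k**2   # ... is at most Var(X)/k^2 by Chebyshev
print(tail, bound)   # 1/3 35/48
assert tail <= bound
```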


5 Expectation and Variance
5.1 Introduction
Expectation and variance are the two most important examples of summary statistics for a random
variable.
Expectation, or mean: a measure of the location of the distribution.
Other measures of location are the mode and median.

4 Random Variables
Often we are not interested in all events in a sample space, but some subset of events whose members
each have a natural association with a real number. This motivates the idea of random variables.
Definition 4.1.1
A random variable, X, is a real-valued function defined on the sample space.
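On this definition a random variable is just a function from outcomes to numbers; a minimal sketch, using the (assumed) example of counting heads in two coin tosses:

```python
from itertools import product

# Sample space for two coin tosses
omega = list(product("HT", repeat=2))

# A random variable assigns a real number to each outcome:
# here X = number of heads.
X = {outcome: outcome.count("H") for outcome in omega}
for outcome, value in X.items():
    print(outcome, value)
```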

8 Normal Distribution
8.1 Introduction
The normal distribution is the most important probability distribution, playing a key role in both probability and statistics. It was discovered by Abraham De Moivre in 1733 as the limiting distribution of the binomial.

7 Continuous Random Variables
7.1 Introduction
Example 7.1.1
Spin a pointer and measure the angle clockwise from 12 o'clock. Let X denote the resulting angle. X
is a random variable with range RX = {x : 0 ≤ x < 2π}.
Questions: How would you find the following probabilities?

1 Basic Concepts
1.1 Experiments, Sample Spaces and Events
We begin with an experiment. This may involve doing something or just observing something. The
experiment gives rise to a sample space, the set of possible outcomes. We sometimes make simplifying
assumptions.
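The vocabulary above can be made concrete with a one-die experiment (an illustrative choice):

```python
# Experiment: roll one die. The sample space is the set of possible outcomes.
sample_space = {1, 2, 3, 4, 5, 6}

# Events are subsets of the sample space:
even = {2, 4, 6}
at_least_5 = {5, 6}

assert even <= sample_space   # every event is a subset of the sample space
print(even & at_least_5)      # both events occur: {6}
```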

6 Standard Discrete Distributions
6.1 Bernoulli Distribution
This is a distribution whose range is {0, 1}, where 1 and 0 are often associated with success and failure
respectively. That is, we have
P(Success) = p,  P(Failure) = 1 - p.
Y = 1 with probability p,
Y = 0 with probability 1 - p.

7. Continuous Distributions
Example 7.1.1
Spin a pointer and measure the angle clockwise from
12 o'clock. Let Θ denote the resulting angle. Measure
Θ in radians, so that 0 ≤ Θ < 2π.
Example: Find the following probabilities:
1. P(0 ≤ Θ < π)
2. P(π/4 < Θ ≤ 3π/4)
3. P(Θ = x)
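If the pointer is equally likely to stop anywhere, Θ is uniform on [0, 2π), so the probability of an interval is its length divided by 2π; a sketch under that assumption:

```python
import math

# Assume the pointer is "fair", so Theta is uniform on [0, 2*pi):
# the probability of an interval is its length divided by 2*pi.
def prob_interval(a, b):
    return (b - a) / (2 * math.pi)

print(prob_interval(0, math.pi))                    # P(0 <= Theta < pi) = 1/2
print(prob_interval(math.pi / 4, 3 * math.pi / 4))  # P(pi/4 < Theta <= 3pi/4) = 1/4
# A single point is an interval of length 0, so P(Theta = x) = 0:
print(prob_interval(1.0, 1.0))
```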

1.2 Counting and Combinatorics
Sometimes it's possible to calculate probabilities just by counting. For this, we
need combinatorics: the branch of mathematics dealing with combinations
of objects belonging to a finite set in accordance with certain constraints.
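A sketch of counting in practice, using Python's math.comb (the 5-card-hand numbers are an illustrative example, not from the notes):

```python
import math

# Number of unordered 5-card hands from a 52-card deck
# (a binomial coefficient, "52 choose 5"):
hands = math.comb(52, 5)
print(hands)  # 2598960

# Counting gives probabilities: e.g. P(all 5 cards are clubs)
p_all_clubs = math.comb(13, 5) / math.comb(52, 5)
print(p_all_clubs)
```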

5 Expectation and Variance
5.1 Introduction
These are the two main summary statistics for a random variable.
Expectation: a measure of the location (mean average) of the random variable.
Other measures of location are the mode and median.
Variance: a measure of the spread of the random variable.
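For a discrete random variable both are sums over the pmf; a sketch using the pmf of the number of heads in two fair coin tosses (an assumed example):

```python
from fractions import Fraction

# Assumed example: pmf of the number of heads in two fair coin tosses.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

# Expectation: E[X] = sum over x of x * P(X = x)
mean = sum(x * p for x, p in pmf.items())

# Variance: Var(X) = E[(X - E[X])^2]
variance = sum((x - mean) ** 2 * p for x, p in pmf.items())

print(mean, variance)  # 1 1/2
```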

7 Continuous Random Variables
7.1 Introduction
Example 7.1.1
Spin a pointer and measure the angle clockwise from 12 o'clock. Let Θ denote the resulting angle,
0 ≤ Θ < 2π.
Questions: Find the following probabilities:
1. P(0 ≤ Θ < π)?
2. P(π/4 < Θ ≤ 3π/4)?
3. P(Θ = x)?
Suppose

6. Discrete Distributions
A discrete random variable is one where you can count
the different values it may take. For example, if X can
take any whole number value, then it is discrete.
6.1 Bernoulli Distribution
This is a random variable with only two possible outcomes, 1 (success) and 0 (failure).

8.2 The central limit theorem
This says that (roughly):
Big sums of random variables are often Normally distributed.
http://www.youtube.com/watch?v=9xUBhhM4vbM
More precisely: let X1, X2, . . . , Xn be n independent
and identically distributed random variables
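A quick simulation in this spirit, using sums of Uniform(0,1) variables (the sample sizes are arbitrary choices):

```python
import random
import statistics

random.seed(1)

# Each observation: the sum of n i.i.d. Uniform(0,1) random variables.
n, trials = 30, 20_000
sums = [sum(random.random() for _ in range(n)) for _ in range(trials)]

# By the CLT the sums are approximately N(n/2, n/12):
print(statistics.mean(sums))    # should be close to n/2 = 15
print(statistics.pstdev(sums))  # should be close to sqrt(n/12) ~ 1.58
```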

8. Normal Distribution
A random variable X is said to have a normal distribution with mean μ and variance σ² (or equivalently:
standard deviation σ), written X ~ N(μ, σ²), if X has
pdf given by
f(x) = (1 / (σ√(2π))) exp(−(x − μ)² / (2σ²)),  for −∞ < x < ∞.
So the mean and variance are E[X] = μ and Var(X) = σ².
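The pdf can be coded directly from the formula; a minimal sketch:

```python
import math

def normal_pdf(x, mu, sigma):
    """pdf of N(mu, sigma^2), coded directly from the formula above."""
    coef = 1 / (sigma * math.sqrt(2 * math.pi))
    return coef * math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

# The pdf peaks at x = mu with height 1/(sigma*sqrt(2*pi)):
print(normal_pdf(0, 0, 1))  # ~0.3989
```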

6.4 Negative Binomial
Again suppose that we perform independent Bernoulli trials, each
with probability p of success. If X is the total number of trials
required until we have r successes, then X has a Negative Binomial
distribution. The shorthand is X N
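From this definition the pmf follows by counting: the k-th trial is the r-th success, and the first k−1 trials contain exactly r−1 successes, giving P(X = k) = C(k−1, r−1) p^r (1−p)^(k−r) for k = r, r+1, .... A quick sanity check of that (assumed) pmf:

```python
import math

# pmf of the Negative Binomial: the k-th trial yields the r-th success,
# so the first k-1 trials contain exactly r-1 successes.
def negbin_pmf(k, r, p):
    return math.comb(k - 1, r - 1) * p**r * (1 - p) ** (k - r)

# Sanity check: probabilities sum to 1 (over a long but finite range,
# with r = 3 and p = 0.4 as illustrative parameters).
total = sum(negbin_pmf(k, 3, 0.4) for k in range(3, 200))
print(total)  # ~1.0
```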

Answer. Insufficient information to give a full answer.
The correct answer will lie somewhere between 11/23 and 1.
Let A, B and C denote the events that the randomly chosen student plays football, plays the piano and studies
engineering, respectively. Then

2 Probability
Definition 2.1.1 A probability measure is a real-valued set function defined on the events of a sample
space (set of all possible outcomes) which satisfies the following 3 axioms. The probability of any event
E in the sample space is denoted P(E).

2. Probability
Recall from last week: we wish to assign probabilities
to events. So, define F to be the set of all events.
Definition 2.1.1
A probability measure is a function
P : F → R
which satisfies the axioms of probability.
Axioms of Probability
A1 Pr

1. Basic Concepts
We have a conceptual experiment, E.
The sample space, Ω, for an experiment E is the set of all possible outcomes.
An event related to our experiment is a subset of Ω. That is, an event is a collection of outcomes.
Elementary events are the events containing exactly one outcome.

7.3 Exponential Distribution
A continuous random variable X with pdf given by
fX(x) = λe^(−λx)  if x > 0,
fX(x) = 0         if x ≤ 0,
is said to have an exponential distribution with parameter λ > 0,
written X ~ Exp(λ). By integrating, its cdf is given by
FX(x) = 1 − e^(−λx)  if x > 0,
FX(x) = 0            if x ≤ 0.
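Since the cdf is the integral of the pdf, the two formulas can be checked against each other numerically; a sketch with an illustrative rate λ = 2:

```python
import math

lam = 2.0  # illustrative rate; "lam" because lambda is a Python keyword

def pdf(x):
    return lam * math.exp(-lam * x) if x > 0 else 0.0

def cdf(x):
    return 1 - math.exp(-lam * x) if x > 0 else 0.0

# F(x) should equal the integral of the pdf from 0 to x
# (midpoint-rule numerical integration):
x, n = 1.5, 100_000
dx = x / n
integral = sum(pdf((i + 0.5) * dx) * dx for i in range(n))
print(integral, cdf(x))  # the two values agree closely
```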

Properties of expectation and variance
1. For independent random variables W and Z,
E[WZ] = E[W]E[Z].
2. Expectation: Linearity. Let
Y = sum_{i=1}^n Xi,   Z = sum_{i=1}^n ai·Xi,   W = sum_{i=1}^n gi(Xi).
Then
E[Y] = sum_{i=1}^n E[Xi],   E[Z] = sum_{i=1}^n ai·E[Xi],   E[W] = sum_{i=1}^n E[gi(Xi)].
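Both properties can be verified exactly on a small example; a sketch with two independent fair dice (an assumed example):

```python
from fractions import Fraction
from itertools import product

# Two independent fair dice X1, X2, handled exactly with fractions.
die = {x: Fraction(1, 6) for x in range(1, 7)}
e = sum(x * p for x, p in die.items())  # E[X1] = E[X2] = 7/2

# Property 1 (independence): E[X1*X2] = E[X1]*E[X2]
e_product = sum(x * y * p * q
                for (x, p), (y, q) in product(die.items(), die.items()))
assert e_product == e * e  # 49/4

# Property 2 (linearity): E[a1*X1 + a2*X2] = a1*E[X1] + a2*E[X2]
a1, a2 = 2, -3
lhs = sum((a1 * x + a2 * y) * p * q
          for (x, p), (y, q) in product(die.items(), die.items()))
print(lhs, a1 * e + a2 * e)  # both equal -7/2
```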
