1.5 Independent random variables.
1.5.1 Two random variables.
In example 3.1 it is not hard to see that rolling a 3 on the second roll is independent of rolling a 5 on the
first roll, i.e. the fact that we know that a 5 turned up on the first roll doesn't change the probability
that a 3 turns up on the second roll.
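This independence can be checked directly by enumerating the 36 equally likely outcomes of the two rolls; the sketch below verifies that conditioning on a 5 first leaves the chance of a 3 second unchanged.

```python
from itertools import product
from fractions import Fraction

# All 36 equally likely outcomes (first roll, second roll).
outcomes = list(product(range(1, 7), repeat=2))

p_first_5 = Fraction(sum(1 for a, b in outcomes if a == 5), 36)
p_second_3 = Fraction(sum(1 for a, b in outcomes if b == 3), 36)
p_both = Fraction(sum(1 for a, b in outcomes if a == 5 and b == 3), 36)

# Knowing the first roll was a 5 does not change the chance of a 3 second:
assert p_both / p_first_5 == p_second_3 == Fraction(1, 6)
```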
1.6 Stochastic processes and repeated independent trials
1.6.1 Stochastic processes.
Example 6.1. Let us generalize example 3.1 by supposing we roll a die n times. The outcomes can
now be described by a vector (x1, ..., xn) where x1, ..., xn can each be 1, 2, ..., 6.
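The outcome vectors can be generated mechanically; as a sketch (taking n = 3 as an illustrative choice), there are 6^n of them:

```python
from itertools import product

n = 3  # illustrative number of rolls
# Each outcome is a vector (x1, ..., xn) with each coordinate in 1..6.
outcomes = list(product(range(1, 7), repeat=n))

assert len(outcomes) == 6 ** n == 216
assert outcomes[0] == (1, 1, 1)   # smallest outcome in lexicographic order
```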
1.3 Random variables.
1.3.1 Random variables.
Example 3.1. We roll a die twice (or roll two dice together). The outcome consists of observing the result of the first roll together with the result of the second roll. For example, we might get a 3 on the first roll.
1 Probability
1.1 Probabilities of outcomes and events.
1.1.1 Probabilities of outcomes.
This chapter is an introduction to some of the basic concepts of probability. Let's begin by considering
what we mean by probability.
Example 1.1. Suppose we roll a s
1.2 Conditional probability and independent events.
1.2.1 Conditional probability.
Conditional probabilities are a way to adjust the probability of something happening according to how
much information one has.
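This adjustment is governed by the usual ratio definition P(A | B) = P(A and B)/P(B); as a hypothetical illustration (not the example that follows), suppose we learn that a single roll came up odd:

```python
from fractions import Fraction

rolls = set(range(1, 7))               # one fair die
B = {r for r in rolls if r % 2 == 1}   # information: the roll is odd
A = {5}                                # event of interest: a 5 turned up

p_B = Fraction(len(B), 6)
p_A_and_B = Fraction(len(A & B), 6)
p_A_given_B = p_A_and_B / p_B          # P(A | B) = P(A and B) / P(B)

# The extra information raises the probability from 1/6 to 1/3.
assert p_A_given_B == Fraction(1, 3)
```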
Example 2.1. Suppose a friend rolls a die as
1.7 Permutations and combinations
1.7.1 Permutations.
Sometimes when we make k observations (i1, ..., ik) we are only interested in the case where i1, ..., ik are all distinct.
Example 7.1. How many 3 letter words can we make where all 3 letters are distinct?
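The count in question is the falling product 26 · 25 · 24 (assuming a 26-letter alphabet); a one-line check:

```python
import math

# 3-letter words with all letters distinct from a 26-letter alphabet:
# 26 choices for the first letter, 25 for the second, 24 for the third.
assert math.perm(26, 3) == 26 * 25 * 24 == 15600
```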
2.6 Probability of Reaching a State
2.6.1 Probability of Reaching a State
In the previous section we calculated
the probability that the system will have been in a certain state by a certain time?
or
the probability that the system will be in a certain st
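One standard way to compute a "reached by time n" probability is to make the target state absorbing; as a sketch for a hypothetical two-state chain where state 0 jumps to state 1 with probability p each step:

```python
# Probability of having been in state 1 by time n, for a hypothetical
# two-state chain with per-step transition probability p from 0 to 1.
# Making state 1 absorbing, the mass remaining at state 0 after n steps
# is (1 - p)**n, so the reaching probability is its complement.
def prob_reached_by(n, p):
    not_reached = 1.0           # start in state 0, target not yet reached
    for _ in range(n):
        not_reached *= (1 - p)  # survive another step without jumping
    return 1 - not_reached      # = 1 - (1 - p)**n

assert prob_reached_by(0, 0.5) == 0                  # no steps, no chance
assert abs(prob_reached_by(3, 0.5) - 0.875) < 1e-12  # 1 - (1/2)**3
```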
1.10 Sums and Other Functions of Random Variables
We have seen some examples of sums of random variables.
Example 1. We roll a die twice. Let X1 = result of first roll and X2 = result of second roll. Let T
be the sum of the two rolls. Then T is the sum of the two random variables: T = X1 + X2.
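The distribution of this sum can be tabulated by enumerating the 36 equally likely outcomes:

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# Distribution of T = X1 + X2 over the 36 equally likely outcome pairs.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))
dist = {t: Fraction(c, 36) for t, c in counts.items()}

assert dist[7] == Fraction(6, 36)            # 7 is the most likely sum
assert dist[2] == dist[12] == Fraction(1, 36)  # the extremes are rarest
assert sum(dist.values()) == 1               # probabilities sum to one
```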
4.3 The Chapman-Kolmogoroff Equations.
The Chapman-Kolmogoroff equations are simplest to write down in matrix form: they simply say that the
transition matrices P(t) satisfy the following relation.
(1)
P(t+s) = P(s) P(t)
for all s, t ≥ 0. This single matrix
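The relation (1) can be verified numerically for a hypothetical two-state chain with generator Q = [[-a, a], [b, -b]], whose transition matrices have the closed form P(t) = Π + e^{-(a+b)t}(I - Π), where both rows of Π equal the stationary distribution (b, a)/(a+b):

```python
import math

a, b = 2.0, 3.0  # illustrative rates for the hypothetical chain

def P(t):
    # Closed-form e^{Qt} for the 2-state generator [[-a, a], [b, -b]].
    e = math.exp(-(a + b) * t)
    s = a + b
    return [[(b + a * e) / s, (a - a * e) / s],
            [(b - b * e) / s, (a + b * e) / s]]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Chapman-Kolmogoroff relation P(t + s) = P(s) P(t), checked entrywise.
s_, t_ = 0.4, 1.1
lhs, rhs = P(s_ + t_), matmul(P(s_), P(t_))
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```

Since the matrices P(t) here commute, checking P(s)P(t) also covers P(t)P(s).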