1.5 Independent random variables.
1.5.1 Two random variables.
In example 3.1 it is not hard to see that rolling a 3 on the second roll is independent of rolling a 5 on the
first roll, i.e. the fact that the first roll came up 5 tells us nothing about the probability that the second roll comes up 3.
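The independence claim above can be checked empirically. The following sketch (an illustration, not from the text) simulates many pairs of rolls and compares the frequency of a 3 on the second roll with and without conditioning on a 5 on the first roll; independence predicts both are close to 1/6.

```python
import random

# Simulate many pairs of die rolls and compare the unconditional and
# conditional frequencies of "second roll = 3".
random.seed(0)
trials = [(random.randint(1, 6), random.randint(1, 6)) for _ in range(200_000)]

p_second_3 = sum(1 for a, b in trials if b == 3) / len(trials)
given_5 = [(a, b) for a, b in trials if a == 5]
p_second_3_given_5 = sum(1 for a, b in given_5 if b == 3) / len(given_5)

print(p_second_3, p_second_3_given_5)  # both should be near 1/6
```

Both frequencies settle near 1/6 as the number of trials grows, consistent with independence.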
1.6 Stochastic processes and repeated independent trials
1.6.1 Stochastic processes.
Example 6.1. Let us generalize example 3.1 by supposing we roll a die n times. The outcomes can
now be described by sequences (i1, . . . , in) where each ik is one of the numbers 1, . . . , 6.
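A minimal sketch of this outcome space (assuming, as in the completed sentence above, that outcomes are sequences of n values from 1 to 6): for a small n we can enumerate all of them and count that there are 6^n.

```python
from itertools import product

# Enumerate all outcome sequences (i1, ..., in) of n die rolls for n = 3.
n = 3
outcomes = list(product(range(1, 7), repeat=n))
print(len(outcomes))  # 6**3 = 216 sequences
```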
1.3 Random variables.
1.3.1 Random variables.
Example 3.1. We roll a die twice (or roll two dice together). The outcome consists of observing the result of the first roll together with the result of the second roll.
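As a sketch of Example 3.1, the sample space is the set of 36 ordered pairs, and the two results can be read off as functions of the outcome (the names X1, X2 below are the usual choice for such random variables, an assumption on my part rather than notation quoted from this passage):

```python
from itertools import product

# Sample space of two die rolls: ordered pairs (first roll, second roll).
sample_space = list(product(range(1, 7), repeat=2))

def X1(outcome): return outcome[0]  # result of the first roll
def X2(outcome): return outcome[1]  # result of the second roll

print(len(sample_space), X1((5, 3)), X2((5, 3)))  # 36 5 3
```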
1 Probability
1.1 Probabilities of outcomes and events.
1.1.1 Probabilities of outcomes.
This chapter is an introduction to some of the basic concepts of probability. Let's begin by considering
what we mean by the probability of an outcome.
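One way to make "probability of an outcome" concrete, as a sketch under the usual assumption of a finite set of equally likely outcomes: assign each outcome a nonnegative number so that the numbers sum to 1, and let the probability of an event be the sum over its outcomes.

```python
from fractions import Fraction

# A probability assignment for a fair die: each outcome gets 1/6,
# and the six probabilities sum to 1.
p = {i: Fraction(1, 6) for i in range(1, 7)}
assert sum(p.values()) == 1

# The probability of an event is the sum over its outcomes.
print(sum(p[i] for i in (2, 4, 6)))  # probability of an even roll: 1/2
```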
1.2 Conditional probability and independent events.
1.2.1 Conditional probability.
Conditional probabilities are a way to adjust the probability of something happening according to how
much information we have.
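A small counting sketch of the standard definition P(A | B) = P(A and B) / P(B), using equally likely two-dice outcomes (the particular events A and B below are my own illustrative choices, not examples from the text):

```python
from fractions import Fraction
from itertools import product

# Two dice: 36 equally likely outcomes.
omega = list(product(range(1, 7), repeat=2))

# B = "first roll is at least 4", A = "the sum is 8".
B = [w for w in omega if w[0] >= 4]
A_and_B = [w for w in B if sum(w) == 8]

# P(A | B) = |A and B| / |B| when outcomes are equally likely.
print(Fraction(len(A_and_B), len(B)))  # 3/18 = 1/6
```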
1.7 Permutations and combinations.
1.7.1 Permutations.
Sometimes when we make k observations (i1, . . . , ik) we are only interested in the case where i1, . . . , ik are all distinct.
Example 7.1. How many 3 letter words can be formed if all the letters must be distinct?
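The count behind such a question can be sketched as a permutation count (assuming a 26-letter alphabet, which the truncated example does not state explicitly): there are 26 choices for the first letter, 25 for the second, and 24 for the third.

```python
import math

# Number of 3-letter words with all letters distinct, from 26 letters:
# the number of permutations of 26 things taken 3 at a time.
print(math.perm(26, 3))  # 26 * 25 * 24 = 15600
```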
2.6 Probability of Reaching a State
2.6.1 Probability of Reaching a State
In the previous section we calculated the probability that the system is in a given state at a given time. In this section we ask a different question: what is the probability that the system will have been in a certain state by a certain time, or, in other words, the probability that it reaches that state at some point during that time?
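The reaching probability can be estimated by simulation. The following is an illustration only: the two-state chain and its transition probabilities below are made up for the sketch, not taken from the text.

```python
import random

# A made-up two-state Markov chain: from each state, a list of
# (next state, transition probability) pairs.
P = {0: [(0, 0.7), (1, 0.3)],
     1: [(0, 0.4), (1, 0.6)]}

def reached_by(n, start=0, target=1):
    """Run the chain for up to n steps; report whether the target
    state was visited at some time <= n."""
    state = start
    for _ in range(n):
        if state == target:
            return True
        r, acc = random.random(), 0.0
        for nxt, p in P[state]:
            acc += p
            if r < acc:
                state = nxt
                break
    return state == target

random.seed(1)
est = sum(reached_by(5) for _ in range(100_000)) / 100_000
print(est)  # near the exact value 1 - 0.7**5
```

For this particular chain the exact answer is easy: the chain misses state 1 only by staying in state 0 for all five steps, so the reaching probability is 1 - 0.7^5, and the estimate should be close to that.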
1.10 Sums and Other Functions of Random Variables
We have seen some examples of sums of random variables.
Example 1. We roll a die twice. Let X1 = result of first roll and X2 = result of second roll.
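The distribution of the sum in Example 1 can be found by direct counting over the 36 equally likely outcomes, as this sketch shows:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Count how many of the 36 outcomes give each value of S = X1 + X2.
counts = Counter(i + j for i, j in product(range(1, 7), repeat=2))
dist = {s: Fraction(c, 36) for s, c in sorted(counts.items())}

print(dist[7])  # the most likely sum: 6/36 = 1/6
```

The probabilities rise from 1/36 at S = 2 up to 1/6 at S = 7 and fall back symmetrically to 1/36 at S = 12.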
4.3 The Chapman-Kolmogoroff Equations.
The Chapman-Kolmogoroff equations are simplest to write down in matrix form. They simply say that the
transition matrices P(t) satisfy the following relation:

P(t + s) = P(t)P(s).    (1)
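Relation (1) can be checked numerically in the discrete-time case, where the t-step transition matrix is the t-th power of the one-step matrix P (the particular 2-state matrix below is a made-up example for the sketch):

```python
# Check P(t + s) = P(t) P(s) for a discrete-time chain, where the
# n-step transition matrix is the n-th matrix power of P.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def matpow(P, n):
    R = [[1.0, 0.0], [0.0, 1.0]]  # 2x2 identity
    for _ in range(n):
        R = matmul(R, P)
    return R

P = [[0.9, 0.1], [0.2, 0.8]]  # a made-up 2-state one-step transition matrix
t, s = 2, 3
lhs = matpow(P, t + s)               # P(t + s)
rhs = matmul(matpow(P, t), matpow(P, s))  # P(t) P(s)
print(all(abs(x - y) < 1e-12
          for r1, r2 in zip(lhs, rhs) for x, y in zip(r1, r2)))  # True
```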