2.2 The Powers of the Transition Matrix
2.2.1 A Formula for the Powers of the Transition Matrix
Last time we saw that probabilities of future events in a Markov chain can be computed from the powers
P^n of the transition matrix P. One way to compute P^n is
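For a concrete numerical sketch, P^n can be computed directly by repeated multiplication; the two-state matrix below is invented purely for illustration:

```python
import numpy as np

# Illustrative two-state transition matrix; each row sums to 1
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# Entry (i, j) of P^n is the probability of going from state i
# to state j in exactly n steps
Pn = np.linalg.matrix_power(P, 10)
```

Each row of P^n still sums to 1, since P^n is itself a transition matrix.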
2 Markov Chains
2.1 Basic Principles
2.1.1 The Transition Matrix
A stochastic process is a mathematical model of a situation in the real world that evolves in time in a
probabilistic fashion, i.e. we cannot completely predict the future.
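A minimal concrete example of a transition matrix (the two-state "weather" chain below is hypothetical, purely for illustration):

```python
import numpy as np

# Hypothetical chain: state 0 = "sunny", state 1 = "rainy".
# P[i, j] is the probability of moving from state i to state j in one step.
P = np.array([[0.8, 0.2],
              [0.4, 0.6]])

# A transition matrix has nonnegative entries and rows summing to 1
assert (P >= 0).all()
assert np.allclose(P.sum(axis=1), 1.0)
```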
2.5 Hitting Times
2.5.1 Hitting Times
Sometimes we want to know things like
What is the probability that the system will have been in a certain state by a certain time?
or
What is the probability that the system will be in a certain state for the first time?
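Such hitting probabilities can be estimated by simulation; the sketch below uses an invented three-state chain and plain Monte Carlo, not any particular method from the text:

```python
import random

# Hypothetical 3-state chain; P[i][j] = one-step probability i -> j
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]

def prob_hit_by(start, target, n_steps, trials=50_000, seed=0):
    """Estimate the probability that the chain, started in `start`,
    visits `target` at least once within n_steps steps."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        state = start
        for _ in range(n_steps):
            state = rng.choices(range(3), weights=P[state])[0]
            if state == target:
                hits += 1
                break
    return hits / trials
```

For example, prob_hit_by(0, 2, 5) estimates the chance of reaching state 2 within 5 steps starting from state 0; for this particular chain the exact value is 1 - 0.8^5, since the one-step probability of entering state 2 is 0.2 from both other states.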
1.12 Single Period Inventory
We buy some item wholesale and sell it retail. The demand for the item varies, so we treat it as a random variable.
We want to know how many to buy wholesale so as to maximize the expected profit.
Example 1: Each day a newsstand
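A newsvendor-style computation of this sort can be sketched as follows; the cost, price, and demand distribution below are invented for illustration:

```python
# Hypothetical sketch: buy papers at wholesale cost c, sell at price r;
# unsold papers are worthless.  The demand distribution is assumed.
c, r = 0.25, 1.00
demand_dist = {0: 0.1, 1: 0.2, 2: 0.4, 3: 0.2, 4: 0.1}  # P(demand = d)

def expected_profit(q):
    """Expected profit when q papers are bought wholesale."""
    return sum(p * (r * min(d, q) - c * q) for d, p in demand_dist.items())

# Choose the order quantity that maximizes expected profit
best_q = max(range(5), key=expected_profit)
```

With these numbers the optimal order quantity is 3, with expected profit 1.15.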
2.7 Number of Visits to a State
2.7.1 Number of Visits to a State
In the previous section we calculated the probability of reaching a state. In this section we want to
calculate the expected number of visits to a state in cases where th
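One standard way to get such expected visit counts, sketched with an invented example: if Q is the part of P restricted to the transient states, the fundamental matrix (I - Q)^-1 records the expected number of visits (the numeric chain below is hypothetical):

```python
import numpy as np

# Hypothetical chain with transient states {0, 1} and an absorbing state.
# Q is the transition matrix restricted to the transient states.
Q = np.array([[0.5, 0.3],
              [0.2, 0.4]])

# Fundamental matrix: N[i, j] = expected number of visits to transient
# state j starting from transient state i (counting the start if i == j)
N = np.linalg.inv(np.eye(2) - Q)
```

Here N[0, 0] = 2.5: starting in state 0, the chain is expected to be in state 0 two and a half times before absorption.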
2.3 Steady State Probabilities
2.3.1 Finding Steady State Probabilities
In the previous section we saw how to compute the powers P^n of the transition matrix P. We saw that
each element of P^n was a constant plus a sum of multiples of powers of numbers λi wh
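Numerically, a steady state vector pi can be found by solving the balance equations pi P = pi together with the normalization sum(pi) = 1; a sketch with a made-up two-state matrix:

```python
import numpy as np

# Hypothetical two-state transition matrix
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# pi solves pi P = pi with components summing to 1.  Transpose to get
# (P^T - I) pi = 0, then append the normalization equation sum(pi) = 1.
A = np.vstack([P.T - np.eye(2), np.ones((1, 2))])
b = np.array([0.0, 0.0, 1.0])
pi = np.linalg.lstsq(A, b, rcond=None)[0]
```

For this matrix pi works out to (5/6, 1/6), and each row of P^n approaches pi as n grows.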
1.11 Averages and Expected Values of Random Variables
One thing that we do frequently is compute the average of a series of related measurements.
Example 1. You are a wholesaler for gasoline and each week you buy and sell gasoline. Naturally
you are interested
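As a numeric sketch of an expected value computation (the demand values and probabilities below are invented):

```python
# Hypothetical discrete random variable: weekly gasoline demand
# (in thousands of gallons), with the probability of each value.
values = [10, 12, 14]
probs = [0.3, 0.5, 0.2]

# The expected value weights each outcome by its probability
expected = sum(v * p for v, p in zip(values, probs))
```

Here the expected demand is 10(0.3) + 12(0.5) + 14(0.2) = 11.8 thousand gallons.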