When N = 5 the matrix is

        0    1    2    3    4    5
   0  1.0    0    0    0    0    0
   1  0.6    0  0.4    0    0    0
   2    0  0.6    0  0.4    0    0
   3    0    0  0.6    0  0.4    0
   4    0    0    0  0.6    0  0.4
   5    0    0    0    0    0  1.0

or the chain may be represented pictorially as
            .4      .4      .4      .4
    0 <---- 1 <---> 2 <---> 3 <---> 4 ----> 5
        .6      .6      .6      .6

where states 0 and 5 each return to themselves with probability 1.
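The transition matrix can also be built programmatically. The short Python sketch below is illustrative only (the variable names and the choice to print rounded rows are my own); it encodes the same chain shown above: up-probability 0.4, down-probability 0.6, with 0 and N absorbing.

```python
# Build the N = 5 gambler's-ruin-style transition matrix:
# p(i, i+1) = 0.4 and p(i, i-1) = 0.6 for 0 < i < N,
# with states 0 and N absorbing (illustrative sketch).
N, p = 5, 0.4

# P[i][j] = probability of moving from state i to state j
P = [[0.0] * (N + 1) for _ in range(N + 1)]
P[0][0] = 1.0          # state 0 stays put with probability 1
P[N][N] = 1.0          # state N stays put with probability 1
for i in range(1, N):
    P[i][i + 1] = p        # step up with probability 0.4
    P[i][i - 1] = 1 - p    # step down with probability 0.6

for row in P:
    print([round(x, 1) for x in row])
```

Printing the rows reproduces the matrix displayed above, and each row sums to 1, as any transition matrix must.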
Example 1.2. Ehrenfest chain. This chain originated in physics as a model
for two cubical volumes of air connected by a small hole. In the mathematical
version, we have two “urns,” i.e., two of the exalted trash cans of probability
theory, in which there are a total of N balls. We pick one of the N balls at
random and move it to the other urn.
Let Xn be the number of balls in the “left” urn after the nth draw. It should
be clear that Xn has the Markov property; i.e., if we want to guess the state
at time n + 1, then Xn, the current number of balls in the left urn, is the only
relevant information from the observed sequence of states Xn, Xn−1, ..., X1, X0.
To check this we note that

P(Xn+1 = i + 1 | Xn = i, Xn−1 = in−1, ..., X0 = i0) = (N − i)/N

since to increase the number of balls in the left urn we have to pick one of the
N − i balls in the other urn, and this probability depends only on the current
value i, not on the earlier states.
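This transition probability is easy to check by simulation. The following Python sketch (names, the choice N = 10, i = 3, and the seed are my own illustrative assumptions) draws one ball uniformly at random and compares the empirical frequency of an upward step with the exact value (N − i)/N.

```python
import random

# Ehrenfest chain: N balls split between two urns; at each step one of the
# N balls is chosen uniformly at random and moved to the other urn.
# x = current number of balls in the left urn.
def ehrenfest_step(x, N, rng):
    # With probability (N - x)/N the chosen ball is in the right urn,
    # so the left urn gains a ball; otherwise it loses one.
    return x + 1 if rng.random() < (N - x) / N else x - 1

# Estimate P(X_{n+1} = i + 1 | X_n = i) for N = 10, i = 3.
rng = random.Random(0)
N, i, trials = 10, 3, 100_000
up = sum(ehrenfest_step(i, N, rng) == i + 1 for _ in range(trials))
print(up / trials, (N - i) / N)  # empirical frequency vs. exact (N - i)/N
```

For N = 10 and i = 3 the exact value is 7/10, and the empirical frequency settles near 0.7 as the number of trials grows.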
Spring '10, DURRETT