If you need help with this notion, see Section A.1 of the appendix.
Turning now to the formal definition, we say that X_n is a discrete time Markov chain with transition matrix p(i, j) if for any j, i, i_{n-1}, ..., i_0,

P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0) = p(i, j)   (1.1)

Here and in what follows, boldface indicates a word or phrase that is being defined or explained.
Equation (1.1) explains what we mean when we say that "given the current state X_n, any other information about the past is irrelevant for predicting X_{n+1}." In formulating (1.1) we have restricted our attention to the temporally homogeneous case, in which the transition probability

p(i, j) = P(X_{n+1} = j | X_n = i)

does not depend on the time n.
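As a concrete illustration (not from the text), a time-homogeneous chain can be sketched as a fixed row-stochastic matrix p together with a function that samples X_{n+1} from row i when X_n = i; the 3-state matrix below and all names are hypothetical:

```python
import random

# A hypothetical 3-state chain; row p[i] lists P(X_{n+1} = j | X_n = i),
# so each row must sum to 1 (row-stochastic).
p = [
    [0.5, 0.5, 0.0],
    [0.2, 0.6, 0.2],
    [0.0, 0.5, 0.5],
]

def step(i, p):
    """Sample X_{n+1} given X_n = i; the rule depends only on i, not on n."""
    u, acc = random.random(), 0.0
    for j, pij in enumerate(p[i]):
        acc += pij
        if u < acc:
            return j
    return len(p) - 1  # guard against floating-point round-off

# One sample path of length 10 starting from X_0 = 0.
random.seed(0)
path = [0]
for _ in range(10):
    path.append(step(path[-1], p))
```

Because the same matrix p is used at every step, the path is a realization of a temporally homogeneous chain: the distribution of the next state is determined entirely by the current one.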
Intuitively, the transition probability gives the rules of the game. It is
the basic information needed to describe a Markov chain. In the case of the
gambler’s ruin chain, the transition probability has
p(i, i + 1) = 0.4,  p(i, i - 1) = 0.6,  if 0 < i < N
p(0, 0) = 1
p(N, N) = 1
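These rules can be written directly as a function; the sketch below (names and the choice N = 5 are illustrative, not from the text) encodes the gambler's ruin transition probability and checks that each row sums to 1, as a transition matrix requires:

```python
# Gambler's ruin chain on states {0, 1, ..., N}: from an interior state the
# gambler wins a round with probability 0.4 and loses with probability 0.6;
# states 0 and N are absorbing.
N = 5  # illustrative choice of boundary

def p(i, j):
    if i in (0, N):        # absorbing endpoints: p(0,0) = p(N,N) = 1
        return 1.0 if j == i else 0.0
    if j == i + 1:
        return 0.4         # win a round
    if j == i - 1:
        return 0.6         # lose a round
    return 0.0             # no other transitions are possible

# Each row of the transition matrix sums to 1.
rows_ok = all(abs(sum(p(i, j) for j in range(N + 1)) - 1.0) < 1e-12
              for i in range(N + 1))
```

Evaluating p on a few pairs reproduces the display above: p(2, 3) = 0.4, p(2, 1) = 0.6, and p(0, 0) = p(N, N) = 1.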
This document was uploaded on 03/06/2014 for the course MATH 4740 at Cornell, Spring '10, taught by Durrett.