P(X_{n-1} = i_{n-1}, ..., X_0 = i_0 | X_n = i_n)

3. Let X = {X_n} be a Markov chain with state space S and transition probability matrix P = [P_ij]. Let Y = {Y_n} be a Markov chain with state space W and transition probability matrix Q = [Q_kl]. Furthermore, assume that X and Y are independent. Then the two-dimensional process Z = {Z_n} with Z_n := (X_n, Y_n) is also a Markov chain (you do not need to prove this, but at least try to understand it intuitively).

(a) Identify the state space of Z.

(b) Identify the transition probability matrix of Z.
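A small numerical sketch of problem 3 may help build intuition. The matrices P, Q and the two-state spaces below are hypothetical examples, not part of the problem; the point is that for independent chains, the transition matrix of the product chain Z = (X, Y) satisfies R[(i,k),(j,l)] = P[i,j] * Q[k,l], which is the Kronecker product of P and Q:

```python
import numpy as np

# Hypothetical example: X on S = {0, 1} with matrix P,
# Y on W = {0, 1} with matrix Q (values chosen arbitrarily).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
Q = np.array([[0.5, 0.5],
              [0.2, 0.8]])

# (a) State space of Z = (X, Y): all pairs (i, k) in S x W.
states_Z = [(i, k) for i in range(2) for k in range(2)]

# (b) By independence, P(Z_{n+1} = (j, l) | Z_n = (i, k))
#     = P[i, j] * Q[k, l], i.e. the Kronecker product of P and Q.
#     Row/column index 2*i + k corresponds to the pair (i, k),
#     matching the ordering of states_Z above.
R = np.kron(P, Q)

print(states_Z)   # the 4 product states
print(R)          # 4x4 matrix; each row sums to 1
```

Checking that every row of R sums to 1 confirms that R is itself a valid transition probability matrix, which is the heart of the claim that Z is a Markov chain.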
Spring '10