... P(X_{n-1} = i_{n-1}, ..., X_0 = i_0 | X_n = i_n)

3. Let X = {X_n} be a Markov chain with state space S and transition probability matrix P = [P_ij]. Let Y be a Markov chain with state space W and transition probability matrix Q = [Q_kl]. Furthermore, assume that X and Y are independent. Then the two-dimensional process Z = {Z_n} with Z_n := (X_n, Y_n) is also a Markov chain (you do not need to prove this, but at least try to understand it intuitively).
(a) Identify the state space of Z.
(b) Identify the transition probability matrix of Z.
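As a numerical sanity check for problem 3, the sketch below builds the product chain for two small hypothetical chains (the matrices P and Q and the state-space sizes are made up for illustration; they are not part of the exercise). Under independence, Z moves from (i, k) to (j, l) with probability P_ij * Q_kl, which, with the pair states listed in row-major order, is the Kronecker product of the two matrices.

```python
import numpy as np

# Hypothetical example chains: X on S = {0, 1}, Y on W = {0, 1, 2}.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])           # transition matrix of X
Q = np.array([[0.5, 0.25, 0.25],
              [0.1, 0.8,  0.1 ],
              [0.3, 0.3,  0.4 ]])    # transition matrix of Y

# By independence, P(Z_{n+1} = (j, l) | Z_n = (i, k)) = P[i, j] * Q[k, l].
# Listing the pair states (i, k) in row-major order, the transition
# matrix of Z is the Kronecker product of P and Q.
R = np.kron(P, Q)

print(R.shape)                         # (6, 6): |S| * |W| pair states
print(np.allclose(R.sum(axis=1), 1))   # every row sums to 1, so R is stochastic
```

This suggests the answers to check against your own reasoning: the state space of Z is the Cartesian product S x W, and its transition matrix has entries P_ij * Q_kl.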
This note was uploaded on 02/24/2010 for the course APMA 1200 taught by Professor Roz during the Spring '10 term at Brown.