Example 1.10. Two-stage Markov chains.

In a Markov chain the distribution of X_{n+1} only depends on X_n. This can easily be generalized to the case in which the distribution of X_{n+1} only depends on the pair (X_{n-1}, X_n). For a concrete example, consider a basketball player who makes a shot with the following probabilities:

  1/2 if he has missed his last two shots
  2/3 if he has hit one of his last two shots
  3/4 if he has hit both of his last two shots

To formulate a Markov chain to model his shooting, we let the states of the process be the outcomes of his last two shots: {HH, HM, MH, MM}, where M is short for miss and H for hit. The transition probability is

          HH    HM    MH    MM
    HH   3/4   1/4    0     0
    HM    0     0    2/3   1/3
    MH   2/3   1/3    0     0
    MM    0     0    1/2   1/2

To explain, suppose the state is HM, i.e., X_{n-1} = H and X_n = M. Since he has hit one of his last two shots, the next outcome will be H with probability 2/3, and when this occurs the next state will be (X_n, X_{n+1}) = (M, H). If he misses, an event of probability 1/3, the next state will be (X_n, X_{n+1}) = (M, M). The Hot Hand is a phenomenon k...
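Since the pair construction turns the second-order process into an ordinary Markov chain, the usual machinery applies. As an illustration (not part of the original text), here is a minimal Python sketch that encodes the transition matrix above and computes its stationary distribution, and from it the player's long-run hit frequency; the state ordering and variable names are illustrative choices.

    import numpy as np

    # States of the pair chain, ordered (X_{n-1}, X_n); H = hit, M = miss.
    states = ["HH", "HM", "MH", "MM"]

    # Transition matrix from the example: P[i, j] is the probability of
    # moving from states[i] to states[j].
    P = np.array([
        [3/4, 1/4, 0.0, 0.0],  # from HH: hit w.p. 3/4 -> HH, miss -> HM
        [0.0, 0.0, 2/3, 1/3],  # from HM: hit w.p. 2/3 -> MH, miss -> MM
        [2/3, 1/3, 0.0, 0.0],  # from MH: hit w.p. 2/3 -> HH, miss -> HM
        [0.0, 0.0, 1/2, 1/2],  # from MM: hit w.p. 1/2 -> MH, miss -> MM
    ])

    # Stationary distribution: the left eigenvector of P for eigenvalue 1,
    # normalized to sum to 1.
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    pi /= pi.sum()

    for s, p in zip(states, pi):
        print(f"pi({s}) = {p:.4f}")

    # The second letter of the state is the most recent shot, so the
    # long-run fraction of hits is the mass on states ending in H.
    print("long-run hit rate:", pi[0] + pi[2])  # pi(HH) + pi(MH)

Solving pi P = pi by hand gives pi = (1/2, 3/16, 3/16, 1/8) over (HH, HM, MH, MM), so the sketch prints a long-run hit rate of pi(HH) + pi(MH) = 11/16.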