Semester 1, 2010-11

2. Determine the transition probabilities for the following Markov chains.

a) Suppose the probability of getting a head when tossing a coin is $p$. Consider the results from tossing the coin successively; the state of the process at time $n$ (after the $n$th toss) is the number of heads minus the number of tails in the first $n$ tosses. Let $X_0 = 0$ and let $\{X_n,\ n = 1, 2, \ldots\}$ be a Markov chain. Given that $X_{n-1} = i$, we have $X_n = i + 1$ if the $n$th toss is a head and $X_n = i - 1$ if the $n$th toss is a tail. Hence, the transition probabilities are

$$
P_{ij} =
\begin{cases}
p & \text{if } j = i + 1,\\
1 - p & \text{if } j = i - 1,\\
0 & \text{otherwise.}
\end{cases}
$$

In matrix form, with the states (all integers) listed in increasing order, $P$ is the doubly infinite tridiagonal matrix

$$
P =
\begin{pmatrix}
\ddots & \ddots & \ddots & & \\
 & 1-p & 0 & p & \\
 & & 1-p & 0 & p \\
 & & & \ddots & \ddots
\end{pmatrix}.
$$

b) Urn 1 contains $N$ black balls and urn 2 contains $N$ white balls. At each stage, a ball is selected at random from each urn, and the two balls are placed into the opposite urns at the same time. The state of the process is the number of white balls in urn 1. Obviously, $P_{01} = 1$ and ...
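Both transition mechanisms can be checked numerically. The sketch below (Python, using exact `Fraction` arithmetic) encodes the random-walk probabilities from part (a) and derives the one-step probabilities for the urn swap in part (b) by enumerating the $N \times N$ equally likely pairs of drawn balls. The function names and the example values $p = 1/3$, $N = 4$, $i = 2$ are illustrative choices, not part of the original solution.

```python
from fractions import Fraction

# Part (a): random walk on the integers, head probability p.
def walk_prob(i, j, p):
    """P(X_n = j | X_{n-1} = i) for the heads-minus-tails walk."""
    if j == i + 1:
        return p              # the nth toss is a head
    if j == i - 1:
        return 1 - p          # the nth toss is a tail
    return Fraction(0)

# Part (b): exact one-step probabilities for the urn swap, obtained by
# enumerating the N*N equally likely (ball from urn 1, ball from urn 2) pairs.
# In state i, urn 1 holds i white and N-i black; urn 2 holds i black and N-i white.
def urn_probs(i, N):
    probs = {}
    urn1 = ['W'] * i + ['B'] * (N - i)
    urn2 = ['B'] * i + ['W'] * (N - i)
    for b1 in urn1:           # ball leaving urn 1
        for b2 in urn2:       # ball leaving urn 2
            j = i - (b1 == 'W') + (b2 == 'W')   # urn 1 loses b1, gains b2
            probs[j] = probs.get(j, Fraction(0)) + Fraction(1, N * N)
    return probs

p = Fraction(1, 3)            # illustrative head probability
assert walk_prob(0, 1, p) == p and walk_prob(0, -1, p) == 1 - p

# With N = 4 and i = 2 white balls in urn 1:
assert urn_probs(2, 4) == {1: Fraction(1, 4), 2: Fraction(1, 2), 3: Fraction(1, 4)}
# From state 0, urn 1 is all black and urn 2 all white, so the swap
# moves a white ball into urn 1 with certainty: P_01 = 1.
assert urn_probs(0, 4) == {1: 1}
```

The enumeration in `urn_probs` reproduces the familiar closed forms $P_{i,i-1} = (i/N)^2$, $P_{i,i} = 2i(N-i)/N^2$, and $P_{i,i+1} = ((N-i)/N)^2$ without assuming them, which makes it a useful sanity check on the hand derivation.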