
...transition probability p(i, i + 1) = (N − i)/N, and p(i, i − 1) = i/N for 0 ≤ i ≤ N. Let µ_n = E_x X_n. (a) Show that µ_{n+1} = 1 + (1 − 2/N) µ_n. (b) Use this and induction to conclude that

    µ_n = N/2 + (1 − 2/N)^n (x − N/2)

From this we see that the mean µ_n converges exponentially rapidly to the equilibrium value of N/2, with the error at time n being (1 − 2/N)^n (x − N/2).

1.54. Prove that if p_ij > 0 for all i and j, then a necessary and sufficient condition for the existence of a reversible stationary distribution is

    p_ij p_jk p_ki = p_ik p_kj p_ji   for all i, j, k

Hint: fix i and take π_j = c p_ij / p_ji.

Exit distributions and times

1.55. The Markov chain associated with a manufacturing process may be described as follows: A part to be manufactured will begin the process by entering step 1. After step 1, 20% of the parts must be reworked, i.e., returned to step 1, 10% of the parts are thrown away, and 70% proceed to step 2. After step 2, 5% of the parts must be returned to step 1, 10% to step 2, 5% are scrapped, and 80% emerge to be sold for a profit. (a) Formula...
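One way to carry out the induction in part (b) of the first exercise above (a sketch only, taking the recursion from part (a) and the base case µ_0 = x as given):

\begin{align*}
\mu_{n+1} &= 1 + \Big(1 - \tfrac{2}{N}\Big)\mu_n
           = 1 + \Big(1 - \tfrac{2}{N}\Big)\Big[\tfrac{N}{2} + \Big(1 - \tfrac{2}{N}\Big)^{n}\Big(x - \tfrac{N}{2}\Big)\Big] \\
          &= 1 + \Big(\tfrac{N}{2} - 1\Big) + \Big(1 - \tfrac{2}{N}\Big)^{n+1}\Big(x - \tfrac{N}{2}\Big)
           = \tfrac{N}{2} + \Big(1 - \tfrac{2}{N}\Big)^{n+1}\Big(x - \tfrac{N}{2}\Big),
\end{align*}

which is the claimed formula with n replaced by n + 1; the base case holds since µ_0 = x = N/2 + (x − N/2).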
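For 1.54, the sufficiency direction of the hint comes down to checking detailed balance for π_j = c p_ij / p_ji with i fixed; one line of algebra (a sketch, using only the cycle condition stated above):

\begin{align*}
\pi_j p_{jk} = \pi_k p_{kj}
\;\Longleftrightarrow\;
c\,\frac{p_{ij}}{p_{ji}}\,p_{jk} = c\,\frac{p_{ik}}{p_{ki}}\,p_{kj}
\;\Longleftrightarrow\;
p_{ij}\,p_{jk}\,p_{ki} = p_{ik}\,p_{kj}\,p_{ji},
\end{align*}

so the cycle condition is exactly what makes this π reversible; c is then chosen so that the π_j sum to 1 (possible when the state space is finite, since all p_ij > 0).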
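The statement of 1.55 is cut off at "(a) Formula...". A minimal numerical sketch, assuming the natural four-state formulation (step 1, step 2, sold, scrapped), which computes the exit distribution, i.e., the probability that a part entering step 1 is eventually sold:

import numpy as np

# States: 0 = step 1, 1 = step 2, 2 = sold, 3 = scrapped (assumed formulation).
P = np.array([
    [0.20, 0.70, 0.00, 0.10],  # after step 1: 20% reworked, 70% to step 2, 10% thrown away
    [0.05, 0.10, 0.80, 0.05],  # after step 2: 5% to step 1, 10% redo step 2, 80% sold, 5% scrapped
    [0.00, 0.00, 1.00, 0.00],  # "sold" is absorbing
    [0.00, 0.00, 0.00, 1.00],  # "scrapped" is absorbing
])

# Exit distribution: h(x) = P_x(part is eventually sold) solves h = Q h + r on the
# transient states {step 1, step 2}, where Q is the transient block of P and
# r holds the one-step probabilities of jumping directly to "sold".
Q = P[:2, :2]
r = P[:2, 2]
h = np.linalg.solve(np.eye(2) - Q, r)
print("P(sold | enter at step 1) =", h[0])
print("P(sold | enter at step 2) =", h[1])

The complementary probabilities 1 − h give the chance a part is eventually scrapped, and the same matrix (I − Q)^{-1} also yields the expected number of visits to each step before the part leaves the system.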