Therefore, p(x_1, ..., x_n) is a stationary probability distribution, so provided that the Markov chain is irreducible and aperiodic, we can conclude that it is the limiting probability vector for the Gibbs sampler. It also follows from the preceding that p(x_1, ..., x_n) would be the limiting probability vector even if the Gibbs sampler were not systematic in first changing the value of X_1, then X_2, and so on. Indeed, even if the component whose value was to be changed were always randomly determined, p(x_1, ..., x_n) would remain a stationary distribution, and would thus be the limiting probability mass function provided that the resulting chain is aperiodic and irreducible.
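To make the random-scan variant concrete, here is a minimal sketch of a random-scan Gibbs sampler for a toy bivariate pmf. The joint pmf p, the state space, and the run length are illustrative assumptions, not part of the notes.

```python
import random

# Toy joint pmf p(x1, x2) on {0,1} x {0,1}; any strictly positive pmf
# keeps the resulting chain irreducible and aperiodic.
p = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

def gibbs_step(state):
    """One random-scan Gibbs update: pick a coordinate uniformly at
    random and resample it from its conditional distribution given
    the other coordinate."""
    i = random.randrange(2)            # randomly chosen component
    x = list(state)
    values = [0, 1]
    weights = []
    for v in values:                   # unnormalized p(x_i = v | x_{-i})
        x[i] = v
        weights.append(p[tuple(x)])
    x[i] = random.choices(values, weights=weights)[0]
    return tuple(x)

# Long-run frequencies should approach p, since p is stationary for
# the random-scan chain and the chain is irreducible and aperiodic.
state, counts = (0, 0), {s: 0 for s in p}
for _ in range(200_000):
    state = gibbs_step(state)
    counts[state] += 1
print({s: round(c / 200_000, 3) for s, c in counts.items()})
```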
2 Exercises

1) Each day one of n possible elements is requested; it is the i-th one with probability P_i, where \sum_{i=1}^{n} P_i = 1. These elements are at all times arranged in an ordered list that is revised as follows: the element selected is moved to the front of the list, and the relative positions of all other elements remain unchanged. Define the state at any time to be the ordering of the list at that time.

(a) Argue that the above is a Markov chain.

(b) For any state (i_1, ..., i_n) (which is a permutation of (1, 2, ..., n)), let π(i_1, ..., i_n) denote the limiting probability. Argue that
\[
\pi(i_1, \ldots, i_n) = P_{i_1} \cdot \frac{P_{i_2}}{1 - P_{i_1}} \cdots \frac{P_{i_{n-1}}}{1 - P_{i_1} - \cdots - P_{i_{n-2}}}.
\]
(The factor for the last position is omitted because it equals 1.)
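As a sanity check on the formula in (b), the following sketch simulates the move-to-front chain for n = 3 and compares empirical frequencies against the claimed limiting probabilities. The request probabilities P and the run length are illustrative choices, not from the notes.

```python
import random
from collections import Counter
from itertools import permutations

# Request probabilities for n = 3 elements (illustrative values).
P = [0.5, 0.3, 0.2]
order = [0, 1, 2]                                # current ordering (the state)

counts = Counter()
for _ in range(200_000):
    i = random.choices(range(3), weights=P)[0]   # requested element
    order.remove(i)                              # move-to-front update
    order.insert(0, i)
    counts[tuple(order)] += 1

# Compare empirical frequencies with the claimed limiting probabilities
# pi(i1, i2, i3) = P[i1] * P[i2] / (1 - P[i1]); the last factor equals 1.
for perm in permutations(range(3)):
    pred = P[perm[0]] * P[perm[1]] / (1 - P[perm[0]])
    print(perm, round(counts[perm] / 200_000, 3), round(pred, 3))
```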
2) Let {X_n, n ≥ 0} be a Markov chain with stationary probabilities π_j, j ≥ 0. Suppose that X_0 = 0 and define T = min{n > 0 : X_n = 0}. Let Y_j = X_{T−j}, j = 0, 1, ..., T. Show that {Y_j, j = 0, ..., T} is distributed as the states visited by a Markov chain (the "reversed" Markov chain) with transition probabilities
\[
P^{*}_{ij} = \frac{\pi_j P_{ji}}{\pi_i},
\]
started in state 0 and watched until it returns to 0.
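The following sketch illustrates the reversed-chain construction numerically: it builds P*_{ij} = π_j P_{ji} / π_i from a transition matrix and checks that P* is stochastic and that π is stationary for it as well. The 3-state matrix P is an arbitrary example, not from the notes.

```python
import numpy as np

# An illustrative irreducible transition matrix P (rows sum to 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Stationary distribution: left eigenvector of P for eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()

# Reversed-chain transition probabilities P*_ij = pi_j * P_ji / pi_i.
P_star = (P.T * pi) / pi[:, None]

print(P_star.sum(axis=1))   # each row of P* sums to 1
print(pi @ P_star, pi)      # pi is stationary for the reversed chain too
```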
3) Consider a finite Markov chain on the state space {0, 1, 2, ..., N} with transition probability matrix P = (P_{ij}), i, j = 0, ..., N, and divide the state space into the three classes {0}, {1, 2, ..., N − 1}, and {N}. Let 0 and N be absorbing states, both accessible from all states in 1, ..., N − 1, and let {1, 2, ..., N − 1} be a transient class. Let k be a transient state. Define an auxiliary process (the "return process") with transition matrix \widetilde{P} by altering the first and last rows of P so that \widetilde{P}_{0k} = \widetilde{P}_{Nk} = 1, leaving the other rows unchanged. The return process is clearly irreducible. Prove that the expected time until absorption μ_k, with initial state k in the original process, equals 1/(π_0 + π_N) − 1, where π_0 + π_N is the stationary probability of being in state 0 or N for the return process. Hint: use the relation between stationary probabilities and expected recurrence times of states.
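A numerical check of the identity, using an illustrative gambler's-ruin chain (the chain, N, p, and k are assumptions made for the example): it computes μ_k from the fundamental matrix of the original chain and compares it with 1/(π_0 + π_N) − 1 computed from the return process.

```python
import numpy as np

# Gambler's-ruin chain on {0,...,4} with absorbing barriers (illustrative).
N, p = 4, 0.4
P = np.zeros((N + 1, N + 1))
P[0, 0] = P[N, N] = 1.0
for i in range(1, N):
    P[i, i + 1], P[i, i - 1] = p, 1 - p

k = 2  # initial transient state

# Return process: from 0 or N, jump back to k with probability 1.
P_ret = P.copy()
P_ret[0, :] = 0; P_ret[0, k] = 1.0
P_ret[N, :] = 0; P_ret[N, k] = 1.0

vals, vecs = np.linalg.eig(P_ret.T)
pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
pi = pi / pi.sum()

# Expected absorption time from k via the fundamental matrix (I - Q)^{-1},
# where Q is the transient-to-transient block of P.
Q = P[1:N, 1:N]
mu = np.linalg.solve(np.eye(N - 1) - Q, np.ones(N - 1))[k - 1]

print(mu, 1.0 / (pi[0] + pi[N]) - 1.0)   # the two values should agree
```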
3 Reversibility

Suppose that {X_n : 0 ≤ n ≤ N} is an irreducible, non-null, persistent Markov chain with transition matrix P and stationary distribution π. Suppose further that X_n has distribution π for every n. Define the "reversed chain" Y by Y_n = X_{N−n} for 0 ≤ n ≤ N.
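As a standard computation (compare Exercise 2 above), the one-step transition probabilities of Y follow directly from the stationarity of the X_n:
\[
P(Y_{n+1} = j \mid Y_n = i)
  = \frac{P(X_{N-n-1} = j,\ X_{N-n} = i)}{P(X_{N-n} = i)}
  = \frac{\pi_j P_{ji}}{\pi_i},
\]
which agrees with the reversed-chain transition probabilities P^{*}_{ij} from Exercise 2.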