Introduction to Stochastic Processes, Spring 2011
Homework #4
Due: Thursday, February 10th, 2011 in class

Stationary Distributions for Markov Chains

1. Consider a two-state Markov chain with the states labelled 0 and 1. Let the transition matrix be

       P = [ 1 - a      a   ]
           [   b      1 - b ]

   where 0 < a < 1 and 0 < b < 1. The first row is for state 0, the second is for state 1.

   (i) Compute the stationary distribution pi = [ pi_0  pi_1 ].

   (ii) For n >= 1, let

       f(n) = P(the first return of the Markov chain to state 0 is at time n | X_0 = 0).

   Compute f(n) for n >= 1.

   (iii) Let m be the mean number of time steps it takes for the Markov chain to return to zero, given that the chain starts at position zero. Use the formula for f(n) to compute m. Check that pi_0 = 1/m.

2. Recall the umbrella problem of Assignment 3: A man has a total of N umbrellas that he splits between home and work. Before travelling from one place to the other, he looks out the window, and if it's raining he takes an umbrella with him; otherwise he leaves without one. At any given time there is a 20% chance that it is raining. Suppose that at time zero he is at home, at time one he is at work, at time two he's back home, and so on, and let X_j, j >= 0 be the number...
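As a numerical sanity check on Problem 1 (not part of the assignment itself), the identity pi_0 = 1/m can be verified for concrete parameter values. The values a = 0.3 and b = 0.5 below are arbitrary choices within (0, 1); the closed forms used for pi and f(n) are the standard ones for this two-state chain.

```python
# Hypothetical numeric values for the transition parameters a and b
# (the assignment keeps them symbolic, with 0 < a < 1 and 0 < b < 1).
a, b = 0.3, 0.5

# Transition matrix: row 0 is state 0, row 1 is state 1.
P = [[1 - a, a],
     [b, 1 - b]]

# Closed-form stationary distribution: pi = [b, a] / (a + b).
pi = [b / (a + b), a / (a + b)]

# Check that pi solves pi P = pi, component by component.
for j in range(2):
    assert abs(sum(pi[i] * P[i][j] for i in range(2)) - pi[j]) < 1e-12

# First-return probabilities to state 0, starting from state 0:
#   f(1) = 1 - a  (stay at 0 on the first step)
#   f(n) = a * (1 - b)**(n - 2) * b  for n >= 2
#   (jump to 1, linger there n - 2 steps, then return to 0).
# Mean return time m = sum over n of n * f(n); the series converges
# geometrically, so truncating at n = 200 is more than enough here.
m = (1 - a) * 1 + sum(n * a * (1 - b) ** (n - 2) * b for n in range(2, 200))

# The quantity being checked in part (iii): pi_0 = 1/m.
assert abs(pi[0] - 1 / m) < 1e-9
print(pi[0], 1 / m)  # both print 0.625 for a = 0.3, b = 0.5
```

Summing the series by hand gives m = 1 + a/b = (a + b)/b, whose reciprocal is b/(a + b) = pi_0, matching the truncated numerical sum above.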