…absolute values |X_0|, |X_1|, … to be a Markov chain?

1.6 Definition. We say that a process X_0, X_1, … is rth order Markov if
P{X_{n+1} = i_{n+1} | X_n = i_n, X_{n-1} = i_{n-1}, …, X_0 = i_0}
    = P{X_{n+1} = i_{n+1} | X_n = i_n, …, X_{n-r+1} = i_{n-r+1}}
for all n ≥ r and all i_0, …, i_{n+1} ∈ S.
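A standard consequence of Definition 1.6 (not stated in the text, but worth noting) is that an rth order Markov process becomes an ordinary first-order Markov chain once the state is enlarged to the tuple of the last r values. A minimal sketch, using a hypothetical 2nd-order transition rule of my own invention:

```python
import random

# Sketch: a 2nd-order process viewed as a first-order chain on pairs.
# The state is the tuple (X_{n-1}, X_n); the transition rule below is
# purely illustrative: X_{n+1} tends to repeat X_n when the last two
# values agree, and is symmetric otherwise.

def step(history, rng):
    """One step of the hypothetical 2nd-order rule; `history` = (X_{n-1}, X_n)."""
    prev, cur = history
    p_same = 0.9 if prev == cur else 0.5   # illustrative transition probabilities
    nxt = cur if rng.random() < p_same else -cur
    return (cur, nxt)                      # the new first-order state is the new pair

rng = random.Random(0)
state = (1, 1)                             # chain on pairs from S x S, S = {-1, 1}
path = [state[1]]
for _ in range(10):
    state = step(state, rng)
    path.append(state[1])
print(path)
```

Tracking the pair makes the process Markov in the ordinary sense of the earlier definition, at the cost of a larger state space S^r.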
1.7 Exercise (A moving average process). Moving average models are used frequently in time series analysis, economics, and engineering. For these models, one assumes that there is an underlying, unobserved process …, Y_{-1}, Y_0, Y_1, … of iid random variables. A moving average process takes an average (possibly a weighted average) of these iid random variables in a "sliding window." For example, suppose that at time n we simply take the average of Y_n and Y_{n-1}, defining X_n = (1/2)(Y_n + Y_{n-1}). Our goal is to show that the process X_0, X_1, … defined in this way is not Markov. As a simple example, suppose that the distribution of the iid Y random variables is P{Y_i = 1} = 1/2 = P{Y_i = -1}.
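Before proving anything, it may help to see the failure of the Markov property numerically. The following is a sketch (my own, not part of the exercise) that enumerates the 16 equally likely sign patterns (Y_{-1}, Y_0, Y_1, Y_2) exactly and compares P{X_2 = 1 | X_1 = 0} with P{X_2 = 1 | X_1 = 0, X_0 = 1}:

```python
from itertools import product
from fractions import Fraction

# Exact enumeration check that X_n = (Y_n + Y_{n-1})/2 is not Markov when
# the Y_i are iid with P{Y_i = 1} = P{Y_i = -1} = 1/2.
# ys = (Y_{-1}, Y_0, Y_1, Y_2), all 16 tuples equally likely.

def cond_prob(event, given):
    """P(event | given) by counting equally likely sign patterns."""
    outcomes = [ys for ys in product([1, -1], repeat=4) if given(ys)]
    hits = [ys for ys in outcomes if event(ys)]
    return Fraction(len(hits), len(outcomes))

X = lambda ys, n: Fraction(ys[n + 1] + ys[n], 2)   # X_n = (Y_n + Y_{n-1})/2

p_markov = cond_prob(lambda ys: X(ys, 2) == 1,
                     lambda ys: X(ys, 1) == 0)
p_full   = cond_prob(lambda ys: X(ys, 2) == 1,
                     lambda ys: X(ys, 1) == 0 and X(ys, 0) == 1)
print(p_markov, p_full)   # 1/4 vs 0
```

Conditioning on the extra past value X_0 = 1 changes the conditional law of X_2 (1/4 versus 0), so knowing X_1 alone is not enough: the process is not Markov.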
1. Show that X_0, X_1, … is not a Markov chain.
2. Show that X_0, X_1, … is not an rth order Markov chain for any finite r.

1.8 Notation. We will use the shorthand "P_i" to indicate a probability taken in a Markov chain started in state i at time 0. That is, "P_i(A)" is shorthand for "P{A | X_0 = i}." We'll also use the notation "E_i" in an analogous way for expectation.

[Stochastic Processes, J. Chang, March 30, 1999. Chapter 1: MARKOV CHAINS, page 16]

1.9 Exercise. Let {X_n} be a finite-state Markov chain and let A be a subset of the state space. Suppose we want to determine the expected time until the chain enters the set A, starting from an arbitrary initial state. That is, letting τ_A = inf{n ≥ 0 : X_n ∈ A} denote the first time to hit A (defined to be 0 if X_0 ∈ A), we want to determine E_i(τ_A). Show that
E_i(τ_A) = 1 + Σ_k P(i,k) E_k(τ_A)   for i ∉ A.
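The system in Exercise 1.9 can also be solved numerically. Here is a minimal sketch (the 3-state chain and the iteration scheme are my own choices, not from the text) that iterates the first-step equations, using E_i(τ_A) = 0 for i ∈ A:

```python
# Fixed-point iteration of  h_i = 1 + sum_k P(i,k) h_k  for i not in A,
# with h_i = 0 for i in A, where h_i = E_i(tau_A).
# Hypothetical example chain: states {0, 1, 2}, target set A = {2}.

P = [[0.5, 0.5, 0.0],
     [0.5, 0.0, 0.5],
     [0.0, 0.0, 1.0]]   # state 2 absorbing, so tau_A is its hitting time
A = {2}

h = [0.0] * len(P)
for _ in range(200):     # iterates converge monotonically to the hitting times
    h = [0.0 if i in A else
         1.0 + sum(P[i][k] * h[k] for k in range(len(P)))
         for i in range(len(P))]
print([round(x, 6) for x in h])
```

For this example the exact answers are E_0(τ_A) = 6 and E_1(τ_A) = 4, which you can confirm by solving the two linear equations by hand: h_1 = 1 + (1/2)h_0 and h_0 = 1 + (1/2)h_0 + (1/2)h_1.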
1.10 Exercise. You are flipping a coin repeatedly. Which pattern would you expect to see faster: HH or HT? For example, if you get the sequence TTHHHTH…, then you see "HH" at
the 4th toss and "HT" at the 6th. Letting N_1 and N_2 denote the times required to see "HH" and "HT", respectively, can you guess intuitively whether E(N_1) is smaller than, the same as, or larger than E(N_2)? Go ahead, make a guess (and make my day). Why don't you also simulate some to see how the answer looks; I recommend a computer, but if you like tossing real coins, enjoy yourself by all means. Finally, you can use the reasoning of Exercise 1.9 to solve the problem and evaluate E(N_i). A hint is to set up a Markov chain having the 4 states HH, HT, TH, and TT.

1.11 Exercise. Here is a chance to practice formalizing some typical "intuitively obvious" statements. Let X_0, X_1, … be a finite-state Markov chain.
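(As an aside, the simulation suggested in Exercise 1.10 can be sketched as follows; the helper function, seed, and trial count are my own choices:)

```python
import random

# Estimate E(N_1) and E(N_2) from Exercise 1.10 by simulating fair coin flips:
# N_1 = first time "HH" appears, N_2 = first time "HT" appears.

def waiting_time(pattern, rng):
    """Number of flips until the length-2 `pattern` first appears."""
    last = ""
    n = 0
    while True:
        n += 1
        last = (last + rng.choice("HT"))[-2:]   # keep only the last two flips
        if last == pattern:
            return n

rng = random.Random(0)
trials = 50_000
avg_hh = sum(waiting_time("HH", rng) for _ in range(trials)) / trials
avg_ht = sum(waiting_time("HT", rng) for _ in range(trials)) / trials
print(avg_hh, avg_ht)   # close to 6 and 4, respectively
```

The estimates cluster near 6 for HH and 4 for HT, so HT arrives sooner on average, which the first-step analysis of Exercise 1.9 on the 4-state chain confirms exactly.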
a. We start with an observation about conditional probabilities that will be a useful tool throughout the rest of this problem. Let F_1, …, F_m be disjoint events. Show that if P(E | F_i) = p for all i = 1, …, m, then P(E | ⋃_{i=1}^m F_i) = p.