1. Read pages 192 and 193 of the textbook.
2. Read the remark on page 200 of the textbook.
3. Read Proposition 4.3.
4. [15 points] A stochastic process $\{X_n,\ n = 0, 1, 2, \dots\}$ with state space $S = \{0, 1, 2, \dots, l\}$ is called a Markov chain if for any $i, j, i_0, i_1, \dots, i_{n-1} \in S$ and $n \ge 0$,
$$P\{X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_1 = i_1, X_0 = i_0\} = P\{X_1 = j \mid X_0 = i\}.$$
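The defining property can be illustrated numerically. The sketch below uses a hypothetical 3-state chain (the transition matrix `P` is an assumption for illustration, not taken from the assignment): it estimates, by simulation, a conditional probability given the full history and compares it with the corresponding one-step transition probability.

```python
import random

# Hypothetical 3-state transition matrix (illustrative only, not from the assignment).
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2]]

def simulate(n, start, rng):
    """Return a sample path (X_0, ..., X_n) started at `start`."""
    path = [start]
    for _ in range(n):
        # Draw the next state from the row of P indexed by the current state.
        path.append(rng.choices(range(3), weights=P[path[-1]])[0])
    return path

rng = random.Random(0)

# Estimate P{X_3 = 2 | X_2 = 1, X_1 = 0, X_0 = 0} by conditioning on the full history.
hits = total = 0
for _ in range(200_000):
    p = simulate(3, 0, rng)
    if p[1] == 0 and p[2] == 1:
        total += 1
        hits += (p[3] == 2)

# By the Markov property this should be close to P{X_1 = 2 | X_0 = 1} = P[1][2] = 0.3.
print(hits / total)
```

The conditioning on $X_1$ and $X_0$ is irrelevant once $X_2$ is fixed, which is exactly what the definition asserts.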
a) Use this definition to prove that for the Markov chain defined above and for $n > k \ge 0$,
$$P\{X_n = j \mid X_k = i, X_{k-1} \ne i_{k-1}, \dots, X_1 \ne i_1, X_0 \ne i_0\} = P\{X_{n-k} = j \mid X_0 = i\}.$$
b) For the Markov chain defined above and for $n > k \ge 0$, does
$$P\{X_n = j \mid X_k \ne i, X_{k-1} = i_{k-1}, \dots, X_1 = i_1, X_0 = i_0\} = P\{X_{n-k} = j \mid X_0 \ne i\}?$$
Prove it or disprove it.
Solution:
a) By the definition of conditional probability, summing over all histories consistent with the conditioning event,
$$P\{X_n = j \mid X_k = i, X_{k-1} \ne i_{k-1}, \dots, X_1 \ne i_1, X_0 \ne i_0\}$$
$$= \sum_{\substack{j_{n-1}, \dots, j_{k+1} \\ j_{k-1} \ne i_{k-1}, \dots, j_0 \ne i_0}} \frac{P\{X_n = j, X_{n-1} = j_{n-1}, \dots, X_{k+1} = j_{k+1}, X_k = i, X_{k-1} = j_{k-1}, \dots, X_0 = j_0\}}{P\{X_k = i, X_{k-1} \ne i_{k-1}, \dots, X_0 \ne i_0\}}.$$
Conditional on $X_k = i$, the probability of the future path $(X_{k+1}, \dots, X_n)$ is a product of one-step transition probabilities that does not involve the past, so each term of the numerator factors, giving
$$= \sum_{\substack{j_{n-1}, \dots, j_{k+1} \\ j_{k-1} \ne i_{k-1}, \dots, j_0 \ne i_0}} P\{X_n = j, X_{n-1} = j_{n-1}, \dots, X_{k+1} = j_{k+1} \mid X_k = i\} \cdot \frac{P\{X_k = i, X_{k-1} = j_{k-1}, \dots, X_0 = j_0\}}{P\{X_k = i, X_{k-1} \ne i_{k-1}, \dots, X_0 \ne i_0\}}$$
$$= \sum_{j_{n-1}, \dots, j_{k+1}} P\{X_n = j, \dots, X_{k+1} = j_{k+1} \mid X_k = i\} \sum_{j_{k-1} \ne i_{k-1}, \dots, j_0 \ne i_0} \frac{P\{X_k = i, X_{k-1} = j_{k-1}, \dots, X_0 = j_0\}}{P\{X_k = i, X_{k-1} \ne i_{k-1}, \dots, X_0 \ne i_0\}}.$$
The inner sum equals 1: summing the numerator over all histories with $j_{k-1} \ne i_{k-1}, \dots, j_0 \ne i_0$ reproduces exactly the probability in the denominator. Hence the expression reduces to
$$\sum_{j_{n-1}, \dots, j_{k+1}} P\{X_n = j, X_{n-1} = j_{n-1}, \dots, X_{k+1} = j_{k+1} \mid X_k = i\} = P\{X_n = j \mid X_k = i\} = P\{X_{n-k} = j \mid X_0 = i\},$$
where the last equality holds because both sides factor into the same product of one-step transition probabilities, and these do not depend on the time index by the defining property.
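The identity just proved can be spot-checked by Monte Carlo simulation. The sketch below uses a hypothetical 3-state chain (the matrix `P`, the choice $n = 4$, $k = 2$, $i = 1$, $j = 2$, $i_1 = i_0 = 0$, and the uniform initial distribution are all illustrative assumptions, not from the assignment) and compares empirical estimates of the two sides.

```python
import random

# Hypothetical 3-state transition matrix (illustrative only, not from the assignment).
P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2]]

def path(n, rng, start=None):
    """Sample (X_0, ..., X_n); X_0 is uniform on {0,1,2} unless `start` is given."""
    x = start if start is not None else rng.randrange(3)
    out = [x]
    for _ in range(n):
        x = rng.choices(range(3), weights=P[x])[0]
        out.append(x)
    return out

rng = random.Random(1)
n, k, i, j = 4, 2, 1, 2          # check the identity with i_1 = i_0 = 0

# Left side: estimate P{X_4 = 2 | X_2 = 1, X_1 != 0, X_0 != 0}.
hits = total = 0
for _ in range(300_000):
    p = path(n, rng)
    if p[k] == i and p[1] != 0 and p[0] != 0:
        total += 1
        hits += (p[n] == j)
lhs = hits / total

# Right side: estimate P{X_2 = 2 | X_0 = 1} by starting the chain at state 1.
hits2 = total2 = 0
for _ in range(300_000):
    p = path(n - k, rng, start=i)
    total2 += 1
    hits2 += (p[n - k] == j)
rhs = hits2 / total2

# The two estimates should agree up to Monte Carlo error
# (the exact common value here is (P^2)[1][2] = 0.26).
print(lhs, rhs)
```

Agreement of the two estimates is only a numerical sanity check for one chain and one choice of indices, not a substitute for the proof above.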
Spring '10, Han