1. Read pages 192 and 193 of the textbook.
2. Read the remark on page 200 of the textbook.
3. Read Proposition 4.3.
4. [15 points] A stochastic process $\{X_n, n = 0, 1, 2, \dots\}$ with state space $S = \{0, 1, 2, \dots, l\}$ is called a Markov chain if for any $i, j, i_0, i_1, \dots, i_{n-1} \in S$ and $n \geq 0$,
$$P\{X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \dots, X_1 = i_1, X_0 = i_0\} = P\{X_1 = j \mid X_0 = i\}.$$
a) Use this definition to prove that for the Markov chain defined as above, and for $n > k \geq 0$,
$$P\{X_n = j \mid X_k = i, X_{k-1} \neq i_{k-1}, \dots, X_1 \neq i_1, X_0 \neq i_0\} = P\{X_{n-k} = j \mid X_0 = i\}.$$
b) For the Markov chain defined as above and for $n > k \geq 0$, does
$$P\{X_n = j \mid X_k \neq i, X_{k-1} = i_{k-1}, \dots, X_1 = i_1, X_0 = i_0\} = P\{X_{n-k} = j \mid X_0 \neq i\}?$$
Prove it or disprove it.
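By the definition above, the chain is time-homogeneous, so the $n$-step transition probabilities $P\{X_n = j \mid X_0 = i\}$ are entries of the $n$-th power of the one-step transition matrix $P$. A minimal numerical sketch of this fact (the 3-state matrix below is hypothetical, not taken from the problems here):

```python
import numpy as np

# Hypothetical 3-state one-step transition matrix (illustration only).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# Time-homogeneity: P{X_{n+1}=j | X_n=i} does not depend on n, so the
# n-step probabilities P{X_n=j | X_0=i} are the entries of P^n.
P2 = np.linalg.matrix_power(P, 2)

# Every row of P, and of any power of P, is a probability distribution.
assert np.allclose(P.sum(axis=1), 1.0)
assert np.allclose(P2.sum(axis=1), 1.0)

# P{X_2 = 1 | X_0 = 0} by hand: 0.5*0.3 + 0.3*0.6 + 0.2*0.4 = 0.41
print(P2[0, 1])  # 0.41
```

This is the same reduction used in part a): conditioning on $X_k = i$ restarts the chain, so only the elapsed time $n - k$ matters.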
5. [20 points] Consider a Markov chain with 4 states $\{0, 1, 2, 3\}$. The transition probability matrix takes the following form:
$$P = \begin{pmatrix} 2/5 & 1/5 & 1/5 & a \\ 1/5 & 2/5 & 1/5 & b \\ 1/5 & 1/5 & 2/5 & \end{pmatrix}$$
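Each row of a transition probability matrix must sum to 1, so a complete row determines its one missing entry. Assuming $a$ and $b$ are the fourth entries of the first two rows as laid out above, a quick check with exact fractions:

```python
from fractions import Fraction as F

# Known entries of the rows for states 0 and 1; a and b are assumed to
# be the fourth entry of each row, as laid out above.
row0_known = [F(2, 5), F(1, 5), F(1, 5)]
row1_known = [F(1, 5), F(2, 5), F(1, 5)]

# Rows of a stochastic matrix sum to 1, so the missing entries are:
a = 1 - sum(row0_known)
b = 1 - sum(row1_known)
print(a, b)  # 1/5 1/5
```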