Colorado State University, Ft. Collins
Fall 2008
ECE 516: Information Theory
Lecture 8 September 23, 2008
Recap:
Theorem:
Let $X_1, \ldots, X_n$ be iid with PMF $p(x)$ and entropy $H(X)$. Let $\varepsilon > 0$. Then there exists a code which maps sequences $(x_1, \ldots, x_n)$ of length $n$ into binary strings (of variable lengths), such that the mapping is one-to-one (and therefore invertible) and
$$ E\left[ \frac{l(X^n)}{n} \right] \le H(X) + \varepsilon. $$
Entropy Rates of Stochastic Processes (Chp. 4)
Definition: A stochastic process $\{X_n\}$ is stationary if the joint PMF of any $k$ samples is invariant with respect to any amount of time shift, i.e.,
$$ P[X_{n_1} = x_1, X_{n_2} = x_2, \ldots, X_{n_k} = x_k] = P[X_{n_1+l} = x_1, X_{n_2+l} = x_2, \ldots, X_{n_k+l} = x_k] $$
for any $n_1, \ldots, n_k$, any $l$, and any $x_1, \ldots, x_k \in \mathcal{X}$.
Definition: A stochastic process $\{X_n\}$ is ergodic if its time average (sample mean) is equal to its actual mean (ensemble mean), i.e.,
$$ \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} X_i = \mu_x \quad \text{w.p. } 1. $$
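A quick numerical sketch of the ergodicity definition, assuming (for illustration only) an iid fair six-sided die, which is stationary and ergodic: the time average of the draws approaches the ensemble mean $\mu_x = 3.5$.

```python
import random

random.seed(0)

# Ensemble mean of a fair six-sided die: E[X] = (1 + 2 + ... + 6) / 6 = 3.5
ensemble_mean = sum(range(1, 7)) / 6

# Time average (sample mean) of the iid -- hence ergodic -- process over n draws
n = 100_000
samples = [random.randint(1, 6) for _ in range(n)]
time_average = sum(samples) / n

print(ensemble_mean, time_average)
```

With $n = 10^5$ draws the sample mean lands within a few hundredths of 3.5, consistent with convergence w.p. 1.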
Definition: The entropy rate of a stochastic process $\{X_n\}$ is defined by
$$ H(\mathcal{X}) = \lim_{n \to \infty} \frac{1}{n} H(X_1, \ldots, X_n) $$
when the limit exists.
Theorem:
For a stationary stochastic process $\{X_n\}$, the two limits $H(\mathcal{X})$ and $H'(\mathcal{X}) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, \ldots, X_1)$ exist and are equal, i.e.,
$$ H(\mathcal{X}) = H'(\mathcal{X}). $$
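For a concrete check, consider an illustrative (assumed, not from the notes) two-state time-invariant Markov chain started in its stationary distribution $\mu$, so that the process is stationary. The chain rule gives $H(X_1, \ldots, X_n) = H(X_1) + (n-1) H(X_2 \mid X_1)$, so $\frac{1}{n} H(X_1, \ldots, X_n) \to H(X_2 \mid X_1) = H'(\mathcal{X})$, matching the theorem. A minimal Python sketch:

```python
import math

# Two-state Markov chain (example values), started in its stationary
# distribution mu (check: mu P = mu, since 0.8*0.9 + 0.2*0.4 = 0.8).
P = [[0.9, 0.1],
     [0.4, 0.6]]
mu = [0.8, 0.2]

def H(dist):
    """Entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# H'(X) = lim H(X_n | X_{n-1}, ..., X_1) = sum_i mu_i * H(row i of P)
# (the Markov property reduces the conditioning to the previous state)
H_prime = sum(mu[i] * H(P[i]) for i in range(2))

# (1/n) H(X_1,...,X_n) = (H(X_1) + (n-1) H(X_2|X_1)) / n  ->  H'(X)
def entropy_rate_estimate(n):
    return (H(mu) + (n - 1) * H_prime) / n

print(H_prime, entropy_rate_estimate(1000))
```

The per-$n$ estimate differs from $H'(\mathcal{X})$ only by a $O(1/n)$ term, so the two limits visibly agree.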
Lemma: (Cesàro Mean) If $\lim_{n \to \infty} a_n = a$ and $b_n = \frac{1}{n} \sum_{i=1}^{n} a_i$, then $\lim_{n \to \infty} b_n = a$.
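The lemma is easy to check numerically; a minimal sketch with the illustrative choice $a_n = 1 + 1/n \to 1$, whose running averages also tend to 1:

```python
# Cesaro mean sketch: a_n = 1 + 1/n -> 1, so the running averages
# b_n = (1/n) sum_{i=1}^n a_i must also tend to 1.
N = 100_000
a = [1 + 1 / n for n in range(1, N + 1)]

partial = 0.0
b = []
for i, a_i in enumerate(a, start=1):
    partial += a_i          # running sum of a_1, ..., a_i
    b.append(partial / i)   # b_i = (1/i) * sum

print(a[-1], b[-1])
```

Note that $b_n$ converges more slowly than $a_n$ (here $b_n - 1$ behaves like $\ln n / n$), but it does converge to the same limit.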
Shannon-McMillan-Breiman Theorem:
For a stationary and ergodic process $\{X_n\}$,
$$ -\frac{1}{n} \log p(X_1, \ldots, X_n) \to H(\mathcal{X}) \quad \text{w.p. } 1. $$
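The simplest stationary ergodic process is an iid source, for which $-\frac{1}{n} \log p(X_1, \ldots, X_n) = -\frac{1}{n} \sum_i \log p(X_i)$. A sketch assuming an illustrative Bernoulli(0.3) source, with entropies in bits:

```python
import math
import random

random.seed(1)

# Illustrative iid Bernoulli(p) source; iid is the simplest stationary
# ergodic case, so the SMB theorem reduces to the law of large numbers.
p = 0.3
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # source entropy, bits

n = 200_000
x = [1 if random.random() < p else 0 for _ in range(n)]

# -(1/n) log2 p(X_1, ..., X_n) = -(1/n) sum_i log2 p(X_i) for iid samples
log_prob = sum(math.log2(p) if xi else math.log2(1 - p) for xi in x)
sample_entropy = -log_prob / n

print(H, sample_entropy)
```

The per-symbol log-likelihood concentrates tightly around $H(X) \approx 0.881$ bits, which is the AEP at work.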
Definition: A stochastic process $\{X_n\}$ is a Markov chain if
$$ P[X_n = x_n \mid X_{n-1} = x_{n-1}, \ldots, X_1 = x_1] = P[X_n = x_n \mid X_{n-1} = x_{n-1}] $$
for all $x_1, \ldots, x_n \in \mathcal{X}$.
Definition: The Markov chain is time invariant if
$$ P[X_n = j \mid X_{n-1} = i] = P[X_2 = j \mid X_1 = i] $$
for all $i, j \in \mathcal{X}$.
Definition: For a time invariant Markov chain,
$$ P_{ij} = P[X_n = j \mid X_{n-1} = i] $$
is called the state transition probability from state $i$ to state $j$. The $|\mathcal{X}| \times |\mathcal{X}|$ matrix $\mathbf{P} = [P_{ij}]$ is called the state transition matrix.
Properties: $P_{ij} \ge 0$, $\sum_j P_{ij} = 1$.
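A small sketch checking both properties for an example (assumed, illustrative) 3-state transition matrix; each row of $\mathbf{P}$ is the conditional PMF of the next state given the current one:

```python
# Example 3-state transition matrix (illustrative values, not from the notes)
P = [[0.5,  0.5,  0.0],
     [0.25, 0.5,  0.25],
     [0.0,  0.5,  0.5]]

# Property 1: all entries nonnegative (they are probabilities)
all_nonneg = all(p >= 0 for row in P for p in row)

# Property 2: each row sums to 1 (row i is the PMF of X_n given X_{n-1} = i)
rows_sum_to_one = all(abs(sum(row) - 1.0) < 1e-12 for row in P)

print(all_nonneg, rows_sum_to_one)
```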
Definition: A Markov chain is irreducible if we can go from any state to any other state with positive probability in a finite number of steps, i.e.,
$$ P[X_{n+k} = j \mid X_n = i] > 0 $$
for all $i, j$ and for some $k$.
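Since $P[X_{n+k} = j \mid X_n = i] > 0$ exactly when state $j$ is reachable from state $i$ in $k$ steps, irreducibility can be checked by propagating one-step reachability; a sketch with two illustrative 2-state chains (`is_irreducible` is a hypothetical helper written for this example, not part of the notes):

```python
def is_irreducible(P):
    """Check whether every state can reach every state in finitely many steps."""
    m = len(P)
    # reach[i][j]: state j reachable from state i in at least one step so far
    reach = [[P[i][j] > 0 for j in range(m)] for i in range(m)]
    for _ in range(m):  # m rounds suffice to propagate all finite paths
        for i in range(m):
            for j in range(m):
                if not reach[i][j]:
                    # i reaches j if i reaches some k and k reaches j
                    reach[i][j] = any(reach[i][k] and reach[k][j]
                                      for k in range(m))
    return all(all(row) for row in reach)

P_irr = [[0.0, 1.0],   # 0 -> 1 -> 0 with positive probability: irreducible
         [0.5, 0.5]]
P_red = [[1.0, 0.0],   # state 0 is absorbing, so state 1 is never reached
         [0.5, 0.5]]

print(is_irreducible(P_irr), is_irreducible(P_red))
```

The same test is often phrased with matrix powers: the chain is irreducible iff for every pair $(i, j)$ some power $\mathbf{P}^k$ has $(\mathbf{P}^k)_{ij} > 0$.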