Colorado State University, Ft. Collins
Fall 2008
ECE 516: Information Theory
Lecture 8 September 23, 2008
Recap:
Theorem: Let $X_1, \ldots, X_n$ be i.i.d. with PMF $p(x)$ and entropy $H(X)$. Let $\varepsilon > 0$. Then there exists a code which maps sequences $(x_1, \ldots, x_n)$ of length $n$ into binary strings (of variable lengths), such that the mapping is one-to-one (and therefore invertible) and
$$E\left[\frac{1}{n}\, l(X^n)\right] \le H(X) + \varepsilon.$$
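As a hedged illustration of this recap theorem (not the construction used in lecture), the sketch below builds a Huffman code over blocks of $n = 4$ symbols from an assumed Bernoulli(0.2) source. By Huffman optimality, the expected length per source symbol satisfies $H(X) \le E[l(X^n)]/n \le H(X) + 1/n$, so any $\varepsilon \ge 1/n$ works here.

```python
import heapq
import itertools
import math

def huffman_lengths(probs):
    """Codeword lengths of an optimal (Huffman) binary prefix code for `probs`."""
    counter = itertools.count()  # unique tie-breaker so the heap never compares lists
    heap = [(q, next(counter), [i]) for i, q in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        q1, _, ids1 = heapq.heappop(heap)
        q2, _, ids2 = heapq.heappop(heap)
        for i in ids1 + ids2:  # every leaf under the merged node gets one bit deeper
            lengths[i] += 1
        heapq.heappush(heap, (q1 + q2, next(counter), ids1 + ids2))
    return lengths

p = 0.2  # assumed Bernoulli source parameter, for illustration only
H = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))  # H(X), about 0.722 bits
n = 4  # block length
blocks = list(itertools.product([0, 1], repeat=n))
probs = [p**sum(b) * (1 - p)**(n - sum(b)) for b in blocks]
lengths = huffman_lengths(probs)
E_l_per_symbol = sum(q * L for q, L in zip(probs, lengths)) / n
# Source-coding bounds: H(X) <= E[l(X^n)]/n <= H(X) + 1/n
```

Coding longer blocks (larger $n$) drives the per-symbol overhead $1/n$, and hence the achievable $\varepsilon$, toward zero.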
Entropy Rates of Stochastic Processes (Chp. 4)
Definition: A stochastic process $\{X_n\}$ is stationary if the joint PMF of any $k$ samples is invariant with respect to any amount of time shift, i.e.,
$$P[X_{n_1} = x_1, X_{n_2} = x_2, \ldots, X_{n_k} = x_k] = P[X_{n_1 + l} = x_1, X_{n_2 + l} = x_2, \ldots, X_{n_k + l} = x_k]$$
for any $n_1, \ldots, n_k$, any $k$, any $l$, and $x_1, \ldots, x_k \in \mathcal{X}$.
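A minimal numeric sketch of this definition, using an assumed two-state Markov chain started from its stationary distribution $\pi$ (with $\pi P = \pi$): such a chain is stationary, so the joint PMF of $(X_t, X_{t+1})$ should not depend on $t$. The transition matrix below is hypothetical, chosen only for illustration.

```python
# Assumed two-state Markov chain; pi solves pi P = pi, so starting the
# chain from pi makes it stationary.
P = [[0.9, 0.1], [0.5, 0.5]]  # hypothetical transition matrix
pi = [5 / 6, 1 / 6]           # its stationary distribution

def joint_pmf(t):
    """P[X_t = a, X_{t+1} = b], propagating the marginal from time 0."""
    marg = pi[:]
    for _ in range(t):  # marginal at time t (stays equal to pi by stationarity)
        marg = [sum(marg[a] * P[a][b] for a in range(2)) for b in range(2)]
    return [[marg[a] * P[a][b] for b in range(2)] for a in range(2)]
```

Comparing `joint_pmf(0)` with `joint_pmf(5)` shows the shift-invariance required by the definition (here for $k = 2$ consecutive samples and shift $l = 5$).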
Definition: A stochastic process $\{X_n\}$ is ergodic if its time average (sample mean) is equal to its actual mean (ensemble mean), i.e.,
$$\lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^{n} X_i = \mu_X \quad \text{w.p. } 1.$$
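An i.i.d. process is ergodic, so its time average should approach the ensemble mean. A quick simulation sketch, assuming a Bernoulli($p$) i.i.d. process with $p = 0.3$:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible
p = 0.3         # assumed Bernoulli parameter; ensemble mean mu_X = p
mu = p
n = 200_000
samples = [1 if random.random() < p else 0 for _ in range(n)]
time_avg = sum(samples) / n  # (1/n) sum_{i=1}^n X_i
# For this ergodic process, time_avg -> mu w.p. 1 as n -> infinity
```

With $n = 200{,}000$ samples the standard deviation of the time average is about $\sqrt{p(1-p)/n} \approx 0.001$, so the sample mean lands very close to $\mu_X = 0.3$.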
Definition: The entropy rate of a stochastic process $\{X_n\}$ is defined by
$$H(\mathcal{X}) = \lim_{n \to \infty} \frac{1}{n} H(X_1, \ldots, X_n)$$
when the limit exists.
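This limit can be watched numerically. The sketch below (using the same hypothetical two-state stationary Markov chain as an assumed example) enumerates all $2^n$ sequences to compute $H(X_1, \ldots, X_n)/n$ exactly for small $n$; for a stationary Markov chain the limit is the known closed form $\sum_a \pi_a H(P_{a\cdot})$.

```python
import itertools
import math

# Assumed two-state stationary Markov chain, for illustration only.
P = [[0.9, 0.1], [0.5, 0.5]]
pi = [5 / 6, 1 / 6]  # stationary distribution (pi P = pi)

def block_entropy(n):
    """H(X_1, ..., X_n) in bits, by enumerating all 2^n sequences."""
    H = 0.0
    for seq in itertools.product([0, 1], repeat=n):
        q = pi[seq[0]]
        for a, b in zip(seq, seq[1:]):
            q *= P[a][b]
        if q > 0:
            H -= q * math.log2(q)
    return H

rates = [block_entropy(n) / n for n in range(1, 13)]  # H(X_1..X_n)/n
# Entropy rate of a stationary Markov chain: sum_a pi_a * H(row a of P)
H_rate = sum(pi[a] * -sum(P[a][b] * math.log2(P[a][b]) for b in range(2))
             for a in range(2))
```

The sequence `rates` is non-increasing and approaches `H_rate` (about 0.5575 bits here), illustrating convergence of $\frac{1}{n} H(X_1, \ldots, X_n)$.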
Theorem: For a stationary stochastic process $\{X_n\}$, the two limits $H(\mathcal{X})$ and $H'(\mathcal{X}) = \lim_{n \to \infty} H(X_n \mid X_{n-1}, \ldots, X_1)$ exist and are equal, i.e.,
$$H(\mathcal{X}) = H'(\mathcal{X}).$$
Lemma (Cesàro Mean): If $\lim_{n \to \infty} a_n = a$ and $b_n = \frac{1}{n} \sum_{i=1}^{n} a_i$, then $\lim_{n \to \infty} b_n = a$.
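A quick numeric illustration of the lemma, with the assumed example sequence $a_n = 2 + (-1)^n / n$, which converges to $a = 2$: the running averages $b_n$ must converge to 2 as well.

```python
# Cesaro mean illustration: a_n -> a implies the running averages b_n -> a.
a = 2.0
N = 100_000
a_seq = [a + (-1) ** n / n for n in range(1, N + 1)]  # a_n -> 2

partial = 0.0
b_seq = []
for n, an in enumerate(a_seq, start=1):
    partial += an
    b_seq.append(partial / n)  # b_n = (1/n) * sum_{i=1}^n a_i
```

This is exactly the step used in proving $H(\mathcal{X}) = H'(\mathcal{X})$: the per-sample terms $H(X_n \mid X_{n-1}, \ldots, X_1)$ play the role of $a_n$, and $\frac{1}{n} H(X_1, \ldots, X_n)$ is their Cesàro mean $b_n$ (by the chain rule).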
