EE 250 Information Theory
Introduction to Information Measures
Ertem Tuncel
Information
What is information?
How can we formalize and measure it?
Which gives more information: learning the outcome of a fair die or of a loaded die?
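The question above can be answered numerically; a quick sketch, where the loaded die's PMF is made up for illustration:

```python
import math

def entropy(pmf):
    """Shannon entropy in bits of a probability mass function."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

fair = [1/6] * 6                # fair die: all faces equally likely
loaded = [1/2] + [1/10] * 5     # hypothetical loaded die: one face half the time

print(entropy(fair))    # log2(6) ≈ 2.585 bits
print(entropy(loaded))  # ≈ 2.161 bits -- less uncertainty, less information
```

The fair die maximizes entropy over six outcomes, so learning its outcome conveys more information.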
Flip of a Coin
Let $X$ be the outcome of a coin flip with $\Pr\{X = \text{heads}\} = p$ and $\Pr\{X = \text{tails}\} = 1 - p$.
Data Compression
Source Codes
Description: A code is a mapping from the source alphabet to finite sequences of 0s and 1s.
Example: The canonical mapping.
Source Codes
Example: Another mapping.
The #1 quality we require of a code is unique decodability: the source sequence must be recoverable from the bit sequence.
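Unique decodability can be checked via the Kraft inequality and the prefix condition; a small sketch (the codebook below is hypothetical, not the mapping from the slides):

```python
def kraft_sum(code):
    """Sum of 2^(-l(x)) over all codewords; <= 1 is necessary for unique decodability."""
    return sum(2 ** -len(w) for w in code.values())

def is_prefix_free(code):
    """True if no codeword is a prefix of another (sufficient for unique decodability).

    After lexicographic sorting, any prefix violation shows up in an adjacent pair.
    """
    words = sorted(code.values())
    return all(not b.startswith(a) for a, b in zip(words, words[1:]))

code = {"a": "0", "b": "10", "c": "110", "d": "111"}  # hypothetical codebook
print(kraft_sum(code))       # 1.0 -- satisfies Kraft with equality
print(is_prefix_free(code))  # True
```

A code that violates the prefix condition, such as `{"a": "0", "b": "01"}`, fails the check even though it satisfies the Kraft inequality.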
Entropy Rates of Stochastic Processes
Stochastic Processes
Description: Recall that stochastic processes are characterized by joint PMFs of arbitrary size: $p(x_1, x_2, \ldots, x_n)$, $n = 1, 2, \ldots$
Stationarity: For any $n$ and any shift $k$, $\Pr\{X_1 = x_1, \ldots, X_n = x_n\} = \Pr\{X_{1+k} = x_1, \ldots, X_{n+k} = x_n\}$.
Markovity: A process is Markov if $\Pr\{X_{n+1} = x_{n+1} \mid X_n = x_n, \ldots, X_1 = x_1\} = \Pr\{X_{n+1} = x_{n+1} \mid X_n = x_n\}$.
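For a stationary Markov chain these definitions yield a closed-form entropy rate, $H = -\sum_i \mu_i \sum_j P_{ij} \log P_{ij}$, with $\mu$ the stationary distribution; a sketch (the two-state transition matrix is an arbitrary example):

```python
import numpy as np

def entropy_rate(P):
    """Entropy rate in bits/symbol of a stationary Markov chain with transition matrix P."""
    # stationary distribution mu: left eigenvector of P for eigenvalue 1
    vals, vecs = np.linalg.eig(P.T)
    mu = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    mu = mu / mu.sum()
    # -sum_i mu_i sum_j P_ij log2 P_ij, skipping zero-probability transitions
    logs = np.zeros_like(P)
    mask = P > 0
    logs[mask] = np.log2(P[mask])
    return float(-np.sum(mu[:, None] * P * logs))

# two-state chain: stay with probability 0.9, switch with probability 0.1
P = np.array([[0.9, 0.1], [0.1, 0.9]])
print(entropy_rate(P))  # = H(0.1) ≈ 0.469 bits/symbol
```

Here both rows have the same conditional entropy, so the rate equals the binary entropy of the switching probability.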
Channel Coding and Capacity
Communication Model
[Block diagram: ENCODER → MODULATION → PHYSICAL CHANNEL → DEMODULATION → DECODER]
Communication Model
[Block diagram: ENCODER → CHANNEL → DECODER]
We usually deal with memoryless channels: $p(y_1, \ldots, y_n \mid x_1, \ldots, x_n) = \prod_{i=1}^{n} p(y_i \mid x_i)$.
Some Channel Models
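As an illustration, take the binary symmetric channel (a standard first channel model; the slides' own examples are cut off in this extraction), whose capacity is $C = 1 - H(p)$:

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

print(bsc_capacity(0.0))   # 1.0 -- noiseless channel
print(bsc_capacity(0.5))   # 0.0 -- output independent of input
print(bsc_capacity(0.11))  # ≈ 0.5 bits per channel use
```

Capacity is symmetric in $p$ and $1-p$: a channel that flips every bit is as useful as a noiseless one.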
Typicality
Empirical Entropy
Consider the normalized log-probability for an i.i.d. process: $-\frac{1}{n} \log p(X_1, \ldots, X_n) = -\frac{1}{n} \sum_{i=1}^{n} \log p(X_i) \longrightarrow E[-\log p(X)] = H(X)$ by the weak law of large numbers.
Weakly Typical Sequences
A sequence $(x_1, \ldots, x_n)$ is called $\epsilon$-weakly typical w.r.t. $p(x)$ if
$\left| -\frac{1}{n} \log p(x_1, \ldots, x_n) - H(X) \right| \le \epsilon$.
The $\epsilon$-weakly typical set is the set of all such sequences.
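A quick simulation of the definition (the source PMF is a made-up binary example):

```python
import math
import random

def is_weakly_typical(seq, pmf, eps):
    """True if |-(1/n) log2 p(seq) - H(X)| <= eps for an i.i.d. source with PMF pmf."""
    H = -sum(p * math.log2(p) for p in pmf.values() if p > 0)
    sample_entropy = -sum(math.log2(pmf[s]) for s in seq) / len(seq)
    return abs(sample_entropy - H) <= eps

pmf = {"a": 0.7, "b": 0.3}   # hypothetical i.i.d. binary source
random.seed(0)
seq = random.choices(list(pmf), weights=list(pmf.values()), k=10000)
print(is_weakly_typical(seq, pmf, eps=0.05))  # True with overwhelming probability
```

By the weak law of large numbers, long i.i.d. sequences are typical with probability approaching 1, which the simulation reflects.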
Lossy Source Coding and Rate-Distortion Theory
Lossy Source Coding
[Block diagram: ENCODER → DECODER]
Goal: Compress the source sequence with as little distortion as possible.
Distortion: A measure of fidelity of the estimate $\hat{x}$ relative to the source symbol $x$.
Single-letter distortion: $d(x^n, \hat{x}^n) = \frac{1}{n} \sum_{i=1}^{n} d(x_i, \hat{x}_i)$.
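The single-letter distortion can be computed directly; a sketch using Hamming distortion as the per-letter measure (the sequences are arbitrary):

```python
def single_letter_distortion(x, x_hat, d):
    """Average per-letter distortion (1/n) * sum of d(x_i, x̂_i)."""
    assert len(x) == len(x_hat)
    return sum(d(a, b) for a, b in zip(x, x_hat)) / len(x)

def hamming(a, b):
    """Hamming distortion: 0 if symbols agree, 1 otherwise."""
    return 0 if a == b else 1

print(single_letter_distortion([1, 0, 1, 1], [1, 1, 1, 0], hamming))  # 0.5
```

With Hamming distortion the single-letter distortion is simply the fraction of positions where the reconstruction disagrees with the source.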