Chapter 2: Entropy, relative Entropy and Mutual Information
Homework 1 and Solutions (Date: 01/30/2014)
2.1 Coin Flips. A fair coin is flipped until the first head occurs. Let X denote the number of flips required.
Chapter 8: Differential Entropy
Homework 7 and Solutions
8.1 Differential entropy. Evaluate the differential entropy h(X) = -∫ f ln f for the following:
(a) The exponential density, f(x) = λe^(-λx), x ≥ 0.
(b)
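For part (a) the known answer is h(X) = 1 - ln λ nats. A Riemann-sum sketch to check this numerically (λ = 2 and the integration grid are arbitrary choices of mine, not part of the solutions):

```python
# Midpoint Riemann-sum check of h(X) = 1 - ln(lam) nats for the
# exponential density f(x) = lam * exp(-lam * x), x >= 0.
import math

def exponential_diff_entropy(lam, upper=40.0, n=400_000):
    """Approximate h = -integral of f ln f over [0, upper]."""
    dx = upper / n
    h = 0.0
    for i in range(n):
        x = (i + 0.5) * dx              # midpoint of each subinterval
        f = lam * math.exp(-lam * x)
        h -= f * math.log(f) * dx
    return h

lam = 2.0
print(exponential_diff_entropy(lam), 1.0 - math.log(lam))  # both ≈ 0.3069
```

The truncation at x = 40 is harmless since the integrand decays like e^(-λx).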
Chapter 7: Channel Capacity
Homework 5 and Solutions
7.1 Preprocessing the output. One is given a communication channel with transition probabilities
p(y|x) and channel capacity C = max_{p(x)} I(X; Y).
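Problem 7.1 turns on the fact that preprocessing the output cannot help: for any fixed input distribution and any function g, the data-processing inequality gives I(X; g(Y)) ≤ I(X; Y). A small numeric illustration (the three-output channel and the merging map g below are arbitrary choices of mine, not from the exercise):

```python
# Data-processing check: deterministic preprocessing g(Y) of the channel
# output cannot increase I(X; Y) for a fixed input distribution.
import math

def mutual_information(joint):
    """I(X;Y) in bits from a joint pmf given as {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

p_x = {0: 0.5, 1: 0.5}                          # uniform input (arbitrary)
p_y_given_x = {0: {0: 0.8, 1: 0.15, 2: 0.05},   # illustrative channel
               1: {0: 0.05, 1: 0.15, 2: 0.8}}
g = {0: 0, 1: 0, 2: 1}                          # merge outputs 0 and 1

joint = {(x, y): p_x[x] * p_y_given_x[x][y]
         for x in p_x for y in p_y_given_x[x]}
joint_g = {}
for (x, y), p in joint.items():
    joint_g[(x, g[y])] = joint_g.get((x, g[y]), 0.0) + p

print(mutual_information(joint), mutual_information(joint_g))
```

Merging output symbols discards information, so the second value comes out strictly smaller here.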
Chapter 2: Entropy, relative Entropy and Mutual Information
Homework 2 and Solutions
2.14 Entropy of a sum. Let X and Y be random variables that take on values x1, x2, . . . , xr and
y1, y2, . . .
ECE 5311: Information Theory & Coding
Homework 8: Due at the start of class on 4-Apr.
Spring 2016
1. 9.1
2. 9.2 (Solving max_{E[X^2] ≤ P} I(X; Y1, Y2) will be sufficient)
3. 9.4 (While not specified in t
A Proposal to study Compressed Sensing
Zhang Guo
February 2016
Introduction
Nowadays, compressed data underpins the modern information society, and most of
the data we have acquired is redundant. The file
Ciphertext  Frequency  Plaintext
r           0.129091   E
k           0.098182   T
v           0.096364   O
f           0.08       A
h           0.074545   S
a           0.061818   H
s           0.056364   N
o           0.052727   D
e           0.047273   R
m           0.045455   L
d           0.043636   I
t           0.034545   W
y           0.030909   P
u           0.
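A table like the one above is typically produced by counting single-letter frequencies in the ciphertext and pairing them, in descending order, with the standard English frequency ranking. A sketch of that procedure (the sample ciphertext and the exact ETAOIN... ordering are illustrative assumptions of mine):

```python
# Build a ciphertext-frequency table and a first-guess plaintext mapping
# by rank-matching against the usual English letter-frequency order.
from collections import Counter

ENGLISH_ORDER = "ETAOINSHRDLCUMWFGYPBVKJXQZ"  # common approximate ranking

def frequency_table(ciphertext):
    letters = [c for c in ciphertext.lower() if c.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    table = []
    for rank, (c, n) in enumerate(counts.most_common()):
        guess = ENGLISH_ORDER[rank] if rank < len(ENGLISH_ORDER) else "?"
        table.append((c, n / total, guess))
    return table

# Tiny made-up ciphertext, just to exercise the function:
for cipher, freq, plain in frequency_table("rkv rkv fha rkv"):
    print(f"{cipher}  {freq:.6f}  {plain}")
```

Rank-matching only gives a starting guess; in practice the mapping is refined using digram frequencies and trial decryption.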
ECE 5311: Information Theory & Coding
Homework 4: Due at the start of class on 22-Feb.
Spring 2016
1. 3.1
2. 3.2
3. 3.6
4. 3.9
5. 5.4
6. 5.14
7. 5.25
8. 5.32
Chapter 3: The Asymptotic Equipartition Property
Homework 4 and Solutions
3.1 Markov's inequality and Chebyshev's inequality.
(a) (Markov's inequality.) For any non-negative random variable X and any t > 0,
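Markov's inequality, P(X ≥ t) ≤ E[X]/t for non-negative X and t > 0, is easy to sanity-check numerically; the pmf below is an arbitrary example of mine, not taken from the solutions:

```python
# Empirical check of Markov's inequality on a small discrete pmf:
# the tail probability P(X >= t) never exceeds E[X] / t.
pmf = {0: 0.3, 1: 0.4, 3: 0.2, 10: 0.1}      # arbitrary non-negative pmf
mean = sum(x * p for x, p in pmf.items())    # E[X] = 2.0

for t in (0.5, 1, 2, 5, 10):
    tail = sum(p for x, p in pmf.items() if x >= t)
    assert tail <= mean / t + 1e-12
    print(t, tail, mean / t)
```

The bound is loose for small t (e.g. the bound 4.0 versus the actual tail 0.7 at t = 0.5) and tightens as t grows.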
Chapter 7: Channel Capacity
Homework 6 and Solutions
7.19 Capacity of the carrier pigeon channel. Consider a commander of an army besieged in a fort,
for whom the only means of communication to his allies
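In the usual version of this exercise the pigeons can be lost en route, which turns the link into an erasure channel: assuming each pigeon carries one of M distinguishable messages and is lost (recognizably) with probability α, the capacity is (1 − α) log2 M bits per pigeon. The values M = 256 (an 8-bit message) and α = 1/3 below are placeholders of mine, not taken from the problem statement:

```python
# Capacity of an M-ary erasure channel: C = (1 - alpha) * log2(M)
# bits per use, where alpha is the (recognizable) loss probability.
import math

def erasure_capacity_bits(M, alpha):
    return (1.0 - alpha) * math.log2(M)

print(erasure_capacity_bits(256, 1/3))  # (2/3) * 8 ≈ 5.333 bits per pigeon
```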
Chapter 9: Gaussian Channel
Homework 8 and Solutions
9.1 A channel with two independent looks at Y. Let Y1 and Y2 be conditionally independent and
conditionally identically distributed given X.
(a) S
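The standard setting for 9.1 has Y_i = X + Z_i with Z_1, Z_2 i.i.d. N(0, N), so averaging the two looks halves the noise variance and doubles the effective SNR: C = (1/2) log(1 + 2P/N), strictly less than twice the single-look capacity. A numeric comparison (P and N are arbitrary values of mine; the formulas assume this standard Gaussian setting):

```python
# Compare one-look and two-look Gaussian channel capacities:
# C1 = (1/2) log2(1 + P/N),  C2 = (1/2) log2(1 + 2P/N).
import math

def gauss_capacity_bits(snr):
    return 0.5 * math.log2(1.0 + snr)

P, N = 10.0, 2.0                     # arbitrary power and noise variance
C1 = gauss_capacity_bits(P / N)      # single look
C2 = gauss_capacity_bits(2 * P / N)  # two conditionally independent looks
print(C1, C2)  # the second look helps, but gives less than double C1
```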
Chapter 2: Entropy, relative Entropy and Mutual Information
Homework 2 and Solutions (Date: 02/06/2014)
2.14 Entropy of a sum. Let X and Y be random variables that take on values x1, x2, . . . , xr
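The full exercise 2.14 concerns Z = X + Y; one consequence, when X and Y are independent, is that the sum is at least as uncertain as either summand: H(Z) ≥ max(H(X), H(Y)). A numeric spot check (the two small pmfs are arbitrary examples of mine):

```python
# Check H(X + Y) >= max(H(X), H(Y)) for independent X and Y
# by convolving two small pmfs and comparing entropies.
import math

def entropy_bits(pmf):
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

p_x = {0: 0.5, 1: 0.5}
p_y = {0: 0.25, 1: 0.25, 2: 0.5}

p_z = {}                            # distribution of Z = X + Y
for x, px in p_x.items():
    for y, py in p_y.items():
        p_z[x + y] = p_z.get(x + y, 0.0) + px * py

print(entropy_bits(p_x), entropy_bits(p_y), entropy_bits(p_z))
```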
Chapter 2: Entropy, relative Entropy and Mutual Information
Homework 3 and Solutions (Date: 02/20/2014)
2.27 Grouping rule for entropy: Let p = (p1, . . . , pm) be a probability distribution on m elements
Chapter 7: Channel Capacity
Homework 5 and Solutions (Date: 03/20/2014)
7.1 Preprocessing the output. One is given a communication channel with transition probabilities
p(y|x) and channel capacity C =
Chapter 3: The Asymptotic Equipartition Property
Homework 3 and Solutions (Date: 02/27/2014)
3.1 Markov's inequality and Chebyshev's inequality.
(a) (Markov's inequality.) For any non-negative random variable X and any t > 0,
Chapter 2: Entropy, relative Entropy and Mutual Information
Homework 1 and Solutions
2.1 Coin Flips. A fair coin is flipped until the first head occurs. Let X denote the number of flips required.
(a)
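Part (a) of 2.1 leads to the geometric distribution P(X = k) = 2^(-k), k = 1, 2, . . . , whose entropy works out to 2 bits. A quick numeric check (my own sketch, not part of the posted solutions):

```python
# Numerical check that H(X) = 2 bits when X is the number of fair-coin
# flips until the first head: P(X = k) = 2**-k for k = 1, 2, ...
import math

def geometric_entropy_bits(terms=200):
    """Truncated sum of -p_k * log2(p_k) for p_k = 2**-k."""
    h = 0.0
    for k in range(1, terms + 1):
        p = 2.0 ** -k
        h += -p * math.log2(p)   # each term equals k * 2**-k
    return h

print(geometric_entropy_bits())  # ≈ 2.0
```

The truncated tail beyond 200 terms is astronomically small, so the sum is 2 bits to machine precision.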
Chapter 2: Entropy, relative Entropy and Mutual Information
Homework 3 and Solutions
2.27 Grouping rule for entropy: Let p = (p1, . . . , pm) be a probability distribution on m elements, i.e.,
pi ≥ 0,
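The grouping rule asserts that H(p1, . . . , pm) = H(p1 + p2, p3, . . . , pm) + (p1 + p2) H(p1/(p1 + p2), p2/(p1 + p2)). A quick numeric verification on an arbitrary distribution of my choosing:

```python
# Verify the grouping rule for entropy on a concrete distribution:
# splitting out the first two symbols costs exactly the weighted
# entropy of the within-group split.
import math

def H(probs):
    """Shannon entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p = [0.1, 0.2, 0.3, 0.4]            # arbitrary distribution on m = 4 elements
s = p[0] + p[1]                      # merged mass of the first two symbols
lhs = H(p)
rhs = H([s] + p[2:]) + s * H([p[0] / s, p[1] / s])
print(lhs, rhs)  # the two sides agree
```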