Harvard SEAS
ES250 Information Theory
Homework 1 (Due Date: Oct. 2 2007)
1. Let p(x, y) be given by

   p(x, y)   Y = 0   Y = 1
   X = 0      1/3      0
   X = 1      1/3     1/3
Evaluate the following expressions: (a) H(X), H(Y) (b) H(X|Y), H(Y|X) (c) H(X, Y) (d) H(Y) - H(Y|X) (e) I(X; Y)
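The requested quantities can be checked numerically. The sketch below (not part of the official solution) reads the joint pmf off the table and computes each entropy in bits:

```python
from math import log2

# Joint pmf from the table: p(x, y) indexed by (x, y)
p = {(0, 0): 1/3, (0, 1): 0.0, (1, 0): 1/3, (1, 1): 1/3}

def H(dist):
    """Entropy in bits of an iterable of probabilities."""
    return -sum(q * log2(q) for q in dist if q > 0)

px = [p[(0, 0)] + p[(0, 1)], p[(1, 0)] + p[(1, 1)]]   # marginal of X
py = [p[(0, 0)] + p[(1, 0)], p[(0, 1)] + p[(1, 1)]]   # marginal of Y

HX, HY, HXY = H(px), H(py), H(p.values())
print(HX, HY)               # H(X) = H(Y) = log2(3) - 2/3, about 0.918
print(HXY)                  # H(X, Y) = log2(3), about 1.585
print(HXY - HY, HXY - HX)   # H(X|Y) = H(Y|X) = 2/3
print(HY - (HXY - HX))      # I(X;Y) = H(Y) - H(Y|X), about 0.252
```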

Harvard SEAS
ES250 Information Theory
Homework 6 (Due Date: Jan. 8 2007)
1. Consider the ordinary additive noise Gaussian channel with two correlated looks at X, i.e., Y = (Y1, Y2), where Y1 = X + Z1 and Y2 = X + Z2, with a power constraint P on X, and (

Harvard SEAS
ES250 Information Theory
Homework 2 Solutions
1. An n-dimensional rectangular box with sides X1, X2, . . . , Xn is to be constructed. The volume is Vn = X1 X2 · · · Xn. The edge-length l of an n-cube with the same volume as the random box is l = Vn^(1/n).
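Assuming, purely for illustration, i.i.d. Uniform(0, 1) edges, the law of large numbers gives l = Vn^(1/n) = exp((1/n) Σ ln Xi) → exp(E ln X1) = e^(-1), which a small simulation can confirm:

```python
import random
from math import exp, log

random.seed(0)
n = 200_000
# Assumed edge lengths X_i ~ Uniform(0, 1), i.i.d.
log_sum = sum(log(random.random()) for _ in range(n))
l = exp(log_sum / n)   # l = Vn^(1/n) = exp((1/n) * sum of ln X_i)
print(l)               # close to exp(-1), about 0.368, well below the mean edge 0.5
```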

Harvard SEAS
ES250 Information Theory
Homework 3 Solutions
1. Let X ~ p(x), x = 1, 2, . . . , m, denote the winner of a horse race. Suppose the odds o(x) are fair with respect to p(x), i.e., o(x) = 1/p(x). Let b(x) be the amount bet on horse x, with b(x) ≥ 0 and Σx b(x) = 1
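With fair odds the doubling rate W(b, p) = Σx p(x) log2(b(x) o(x)) reduces to -D(p‖b), so proportional betting b = p is optimal and exactly breaks even in the exponent. A small sketch with a made-up pmf (the race in the problem is not fully specified here):

```python
from math import log2

p = [1/2, 1/4, 1/4]          # hypothetical win probabilities
o = [1/q for q in p]         # fair odds: o(x) = 1/p(x)

def W(b):
    """Doubling rate sum over x of p(x) log2(b(x) o(x))."""
    return sum(pi * log2(bi * oi) for pi, bi, oi in zip(p, b, o))

print(W(p))                  # proportional betting b = p gives W = 0
print(W([1/3, 1/3, 1/3]))    # any other b gives W = -D(p||b) < 0
```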

EE 376A Prof. T. Weissman
Information Theory Thursday, January 14, 2010
Homework Set #1 (Due: Thursday, January 21, 2010) 1. Coin flips. A fair coin is flipped until the first head occurs. Let X denote the number of flips required. (a) Find the entropy H(X) in
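Since P(X = k) = (1/2)^k, the entropy works out to H(X) = Σk k 2^(-k) = 2 bits; a quick numerical check truncating the series:

```python
from math import log2

# P(X = k) = (1/2)^k: first head on flip k of a fair coin
H = -sum(2.0**-k * log2(2.0**-k) for k in range(1, 60))
print(H)   # about 2.0 bits; the tail of the series beyond k = 60 is negligible
```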

EE 376A/Stat 376A Prof. T. Weissman
Information Theory Friday, March 17, 2006
Solutions to Practice Final Problems
These problems are sampled from a couple of the actual finals in previous years. 1. (20 points) Errors and erasures. Consider a binary symmetr

EE 376A/Stat 376A Prof. T. Weissman
Information Theory Thursday, March 23, 2006
Final 1. (20 points) Three Shannon codes. Let {Ui}i≥1 be a stationary first-order Markov source whose alphabet size is r. Note that the stationarity property implies that P (u

Harvard SEAS
ES250 Information Theory
Entropy, relative entropy, and mutual information
1
1.1
Entropy
Entropy of a random variable
Definition The entropy of a discrete random variable X with pmf pX(x) is

H(X) = - Σx p(x) log p(x)
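As a quick illustration of the definition (a sketch, not from the notes), the entropy of a Bernoulli(p) variable in bits:

```python
from math import log2

def H(p):
    """Binary entropy H(p) = -p log2 p - (1 - p) log2 (1 - p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

print(H(0.5))   # 1.0: a fair coin flip carries one full bit
print(H(0.1))   # about 0.469: a biased coin is more predictable
```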
The entropy measures the e

Harvard SEAS
ES250 Information Theory
Asymptotic Equipartition Property (AEP) and Entropy rates
1
1.1
Asymptotic Equipartition Property
Preliminaries
Definition (Convergence of random variables) We say that a sequence of random variables X1, X2, . . . ,
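The AEP developed below states that -(1/n) log2 p(X1, . . . , Xn) converges to H(X) in probability for an i.i.d. source; a Monte Carlo sketch using an assumed Bernoulli(0.3) source:

```python
import random
from math import log2

random.seed(1)
p, n = 0.3, 100_000                  # assumed i.i.d. Bernoulli(p) source
xs = [random.random() < p for _ in range(n)]
# the AEP quantity -(1/n) log2 p(X1, ..., Xn)
val = -sum(log2(p) if x else log2(1 - p) for x in xs) / n
H = -p * log2(p) - (1 - p) * log2(1 - p)
print(val, H)   # the empirical rate is close to H, about 0.881
```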

Solutions to Practice Final 1. Huffman code. Give a Huffman encoding into an alphabet of size D = 4 of the following probability mass function:

p = (8/36, 7/36, 6/36, 5/36, 4/36, 3/36, 2/36, 1/36)
Solution: Huffman code. Codewords: (1) (2) (3) (00) (01) (02) (030) (031) (dummy) (dummy)
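The construction behind this solution can be sketched in code: pad with dummy zero-probability symbols until (number of symbols - 1) is a multiple of D - 1, then repeatedly merge the D least likely nodes. For the pmf above this reproduces the codeword lengths of the solution, with expected length 54/36 = 1.5 code symbols (the function name is illustrative):

```python
import heapq

def huffman_lengths(probs, D):
    """Codeword lengths of an optimal D-ary Huffman code for the given pmf."""
    n = len(probs)
    pad = (-(n - 1)) % (D - 1)        # dummies so (#leaves - 1) % (D - 1) == 0
    depth = [0] * n
    # heap entries: (probability, tie-breaker, indices of real symbols inside)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heap += [(0.0, n + j, []) for j in range(pad)]
    heapq.heapify(heap)
    tie = n + pad
    while len(heap) > 1:
        merged_p, members = 0.0, []
        for _ in range(D):            # merge the D least likely nodes
            p, _, idxs = heapq.heappop(heap)
            merged_p += p
            members += idxs
        for i in members:             # each contained symbol moves one level deeper
            depth[i] += 1
        heapq.heappush(heap, (merged_p, tie, members))
        tie += 1
    return depth

probs = [8/36, 7/36, 6/36, 5/36, 4/36, 3/36, 2/36, 1/36]
lengths = huffman_lengths(probs, D=4)
EL = sum(p * l for p, l in zip(probs, lengths))
print(lengths, EL)   # codeword lengths [1, 1, 1, 2, 2, 2, 3, 3]
```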

Harvard SEAS
ES250 Information Theory
Gambling and Data Compression
1
1.1
Gambling
Horse Race
Definition The wealth relative S(X) = b(X)o(X) is the factor by which the gambler's wealth grows if horse X wins the race, where b(X) is the fraction of the ga
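Over n independent races the wealth is Sn = Πi b(Xi) o(Xi), and (1/n) log2 Sn converges to the doubling rate W(b, p) = Σx p(x) log2(b(x) o(x)). A simulation sketch, in which the pmf, odds, and bets are all made up for illustration:

```python
import random
from math import log2

random.seed(2)
p = [0.5, 0.3, 0.2]      # hypothetical win probabilities
o = [2.0, 4.0, 4.0]      # hypothetical odds (o-for-1)
b = [0.5, 0.3, 0.2]      # hypothetical betting fractions

W = sum(pi * log2(bi * oi) for pi, bi, oi in zip(p, b, o))
n = 100_000
winners = random.choices(range(3), weights=p, k=n)
growth = sum(log2(b[x] * o[x]) for x in winners) / n   # (1/n) log2 Sn
print(growth, W)   # the empirical growth rate per race is close to W
```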

Harvard SEAS
ES250 Information Theory
Channel Capacity
1
1.1
Preliminaries and Definitions
Preliminaries and Examples
Communication between A (the sender) and B (the receiver) is successful when both A and B agree on the content of the message. A communicat

Harvard SEAS
ES250 Information Theory
Differential Entropy and Maximum Entropy
1
1.1
Differential Entropy
Definitions
Definition The differential entropy h(X) of a continuous random variable X with density f(x) is defined as

h(X) = - ∫S f(x) log f(x) dx,

where S is
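As a concrete check of the definition, for X ~ N(0, σ²) the differential entropy is h(X) = (1/2) log2(2πeσ²); the integral can be approximated numerically (σ = 2 is an arbitrary choice for illustration):

```python
from math import log2, pi, e, exp, sqrt

sigma = 2.0                              # illustrative standard deviation
def f(x):
    """Density of N(0, sigma^2)."""
    return exp(-x * x / (2 * sigma * sigma)) / (sigma * sqrt(2 * pi))

# h(X) = -integral of f log2 f, midpoint Riemann sum over [-10 sigma, 10 sigma]
N, lo, hi = 200_000, -10 * sigma, 10 * sigma
dx = (hi - lo) / N
h = -sum(f(lo + (i + 0.5) * dx) * log2(f(lo + (i + 0.5) * dx)) * dx
         for i in range(N))
print(h, 0.5 * log2(2 * pi * e * sigma ** 2))   # both about 3.047 bits
```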

EE 376A Information Theory Prof. T. Weissman
Handout #1 Monday, January 04, 2010 NOT due
Homework Set #0 Note: HW0 has NO effect on your grade in EE376A. There is no requirement to hand it in. These are warm-up exercises in probability. However, we would li

EE 376A Prof. T. Weissman
Information Theory Thursday, January 21, 2010
Homework Set #2 (Due: Thursday, January 28, 2010) 1. Prove that (a) Data processing decreases entropy: If Y = f(X) then H(Y) ≤ H(X). [Hint: expand H(f(X), X) in two different w
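Part (a) can be sanity-checked numerically before proving it: pushing a pmf through f can only merge probability mass, so H(f(X)) ≤ H(X). A sketch with a made-up pmf and function:

```python
from math import log2

def H(pmf):
    """Entropy in bits of a pmf given as a dict of probabilities."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

pX = {1: 0.5, 2: 0.25, 3: 0.125, 4: 0.125}   # hypothetical pmf, H(X) = 1.75 bits
f = lambda x: x % 2                          # a deterministic, many-to-one function

pY = {}
for x, p in pX.items():                      # push the pmf through f
    pY[f(x)] = pY.get(f(x), 0.0) + p

print(H(pY), H(pX))   # H(f(X)) is about 0.954, below H(X) = 1.75
```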

EE 376A/Stat 376A Prof. T. Weissman
Information Theory Friday, March 17, 2006
Practice Final Problems
These problems are sampled from a couple of the actual finals in previous years. 1. (20 points) Errors and erasures. Consider a binary symmetric channel (B

EE 376A/Stat 376A Prof. T. Weissman
Information Theory Monday, February 5, 2007
Sample Midterm 1. (25 points) True or False? If the inequality is true, prove it, otherwise, give a counterexample: (a) H(X, Y|Z) H(X|Z) (b) H(X|Z) H(Z) (c) H(X,

EE 376A/Stat 376A Prof. T. Weissman
Information Theory Tuesday, February 6, 2007
Sample Midterm Solution 1. (25 points) True or False? If the inequality is true, prove it, otherwise, give a counterexample: (a) H(X, Y |Z) H(X|Z) (b) H(X|Z) H(Z) (c) H(X, Y,

EECS 229A, Spring 2007
Solutions to Homework 4
1. Problem 7.5 on pg. 224 of the text. Solution: Using two channels at once.
To find the capacity of the product channel we must find the distribution p(x1, x2) on the input alphabet X1 × X2 which maximizes I (

EE 376A/Stat 376A Information Theory Prof. T. Cover
Handout #32 Thursday, March 12, 2009
Solutions to Practice Final Examination
(Note: When the solutions refer to particular homework problems, it is speaking with respect to homework problems that were as

Harvard SEAS
ES250 Information Theory
Gaussian Channel
1 Definitions
Definition (Gaussian channel) Discrete-time channel with input Xi, noise Zi, and output Yi at time i. This is Yi = Xi + Zi, where the noise Zi is drawn i.i.d. from N(0, N) and assumed t
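With a power constraint P, the well-known capacity of this channel is C = (1/2) log2(1 + P/N) bits per channel use; a one-line evaluation:

```python
from math import log2

def gaussian_capacity(P, N):
    """Capacity C = (1/2) log2(1 + P/N) of the AWGN channel, bits per use."""
    return 0.5 * log2(1 + P / N)

print(gaussian_capacity(3.0, 1.0))   # an SNR of 3 gives exactly 1 bit per use
```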

Harvard SEAS
ES250 Information Theory
Network Information Theory
1 Gaussian Multiple-User Channels
Definition The basic discrete-time AWGN channel with input power P and noise variance N is modeled by

Yi = Xi + Zi,   i = 1, 2, . . .

where i is the time index an

EE 376B Information Theory Prof. T. Cover
Handout #21 Thursday, June 3, 2010
Homework Set #7 1. Minimax regret data compression and Channel Capacity. First consider universal data compression with respect to four source distributions. Let the alphabet V =

Mathematical methods in communication
2nd Semester 2009
Homework Set #4 Channel and Source coding 1. Lossless source coding with side information. Consider the lossless source coding with side information that is available at the encoder and decoder, wher

Information theory: homework exercises
Edited by: Gábor Lugosi
1
Entropy, source coding
Problem 1 (Alternative definition of unique decodability) An f : X → Y code is called uniquely decodable if for any messages u = u1 · · · uk and v = v1 · · · vk (where u1, vi, . .
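Unique decodability can be decided mechanically with the Sardinas-Patterson test: compute the dangling suffixes generated by the codewords and check whether any of them is itself a codeword. A sketch, with codes represented as strings and made-up example codes:

```python
def is_uniquely_decodable(code):
    """Sardinas-Patterson test: a code is uniquely decodable iff no dangling
    suffix generated from the codewords is itself a codeword."""
    code = set(code)
    # round 1: suffixes s with c1 + s = c2 for distinct codewords c1, c2
    suffixes = {c2[len(c1):] for c1 in code for c2 in code
                if c1 != c2 and c2.startswith(c1)}
    seen = set()
    while suffixes:
        new = set()
        for s in suffixes:
            if s in code:
                return False          # ambiguity: some string has two parses
            for c in code:
                if c.startswith(s):   # s is a prefix of a codeword
                    new.add(c[len(s):])
                if s.startswith(c):   # a codeword is a prefix of s
                    new.add(s[len(c):])
        seen |= suffixes
        suffixes = new - seen
    return True

print(is_uniquely_decodable(["0", "01", "11"]))   # True
print(is_uniquely_decodable(["0", "01", "10"]))   # False: "010" has two parses
```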

EE 376A Information Theory Prof. T. Cover
Handout #27 Thursday, February 26, 2009 Due Tuesday, March 10, 2009
Homework Set #8
1. Source and channel. We wish to encode a Bernoulli() process V1 , V2 , . . . for transmission over a binary symmetric channel w

EE 376A/Stat 376A Prof. T. Weissman
Information Theory Thursday, February 8, 2007
Midterm 1. (35 points) Throwing a die. Suppose you have a die with three sides, whose outcome is given as

X = 1 w.p. 1/2, 2 w.p. 1/3, 3 w.p. 1/6
You
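The entropy of this X follows directly from the pmf: H(X) = (1/2) log2 2 + (1/3) log2 3 + (1/6) log2 6, roughly 1.459 bits, as a quick check confirms:

```python
from math import log2

p = [1/2, 1/3, 1/6]                  # pmf of the three-sided die
H = -sum(q * log2(q) for q in p)
print(H)   # about 1.459 bits
```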

Harvard SEAS
ES250 Information Theory
Homework 1 Solution
1. Let p(x, y) be given by

   p(x, y)   Y = 0   Y = 1
   X = 0      1/3      0
   X = 1      1/3     1/3
Evaluate the following expressions: (a) H(X), H(Y) (b) H(X|Y), H(Y|X) (c) H(X, Y) (d) H(Y) - H(Y|X) (e) I(X; Y) (f) Draw a Ven