EE 376A/Stat 376A Prof. T. Weissman
Information Theory Monday, February 5, 2007
Sample Midterm 1. (25 points) True or False? If the inequality is true, prove it; otherwise, give a counterexample: (a) H(X, Y |Z) ≥ H(X|Z) (b) H(X|Z) ≥ H(Z) (c) H(X,
EE 376A/Stat 376A Prof. T. Weissman
Information Theory Tuesday, February 6, 2007
Sample Midterm Solution 1. (25 points) True or False? If the inequality is true, prove it; otherwise, give a counterexample: (a) H(X, Y |Z) ≥ H(X|Z) (b) H(X|Z) ≥ H(Z) (c) H(X, Y,
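Part (a) can be checked numerically by expanding both conditional entropies with the chain rule. A minimal sketch, using a small made-up joint pmf (the distribution below is illustrative, not from the exam):

```python
from math import log2

# p[(x, y, z)] -> probability (an arbitrary illustrative joint pmf)
p = {(0, 0, 0): 0.125, (0, 1, 0): 0.25, (1, 0, 0): 0.125,
     (1, 1, 1): 0.25, (0, 0, 1): 0.125, (1, 0, 1): 0.125}

def marginal(keep):
    """Marginal pmf over the coordinates listed in `keep`."""
    m = {}
    for xyz, pr in p.items():
        key = tuple(xyz[i] for i in keep)
        m[key] = m.get(key, 0.0) + pr
    return m

def H(pm):
    return -sum(pr * log2(pr) for pr in pm.values() if pr > 0)

H_XYZ = H(p)                   # H(X, Y, Z)
H_XZ = H(marginal((0, 2)))     # H(X, Z)
H_Z = H(marginal((2,)))        # H(Z)

# Chain rule: H(X,Y|Z) = H(X,Y,Z) - H(Z) and H(X|Z) = H(X,Z) - H(Z)
H_XY_given_Z = H_XYZ - H_Z
H_X_given_Z = H_XZ - H_Z

assert H_XY_given_Z >= H_X_given_Z - 1e-12   # part (a) holds
```

The inequality in (a) holds for every distribution since H(X, Y |Z) − H(X|Z) = H(Y |X, Z) ≥ 0; the check above only illustrates it on one example.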
EE 376A/Stat 376A Prof. T. Weissman
Information Theory Friday, March 17, 2006
Practice Final Problems
These problems are sampled from a couple of the actual finals in previous years. 1. (20 points) Errors and erasures. Consider a binary symmetric channel (B
EE 376A Prof. T. Weissman
Information Theory Thursday, January 21, 2010
Homework Set #2 (Due: Thursday, January 28, 2010) 1. Prove that (a) Data processing decreases entropy: If Y = f(X) then H(Y) ≤ H(X). [Hint: expand H(f(X), X) in two different w
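The claim in 1(a) can be sanity-checked on a small example: applying a non-injective function merges probability mass and can only lower entropy. A minimal sketch (the pmf and the function f below are illustrative choices, not from the problem set):

```python
from math import log2

def H(pm):
    """Entropy in bits of a pmf given as {symbol: probability}."""
    return -sum(p * log2(p) for p in pm.values() if p > 0)

# Illustrative pmf for X and a non-injective function f
pX = {0: 0.5, 1: 0.25, 2: 0.125, 3: 0.125}
f = lambda x: x % 2           # merges symbols, so it can only lose information

# Push the pmf through f to get the pmf of Y = f(X)
pY = {}
for x, pr in pX.items():
    pY[f(x)] = pY.get(f(x), 0.0) + pr

assert H(pY) <= H(pX) + 1e-12     # H(f(X)) <= H(X)
```

Here H(pX) = 1.75 bits while H(pY) ≈ 0.954 bits, so the inequality is strict for this f.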
EE 376A Prof. T. Weissman
Information Theory Thursday, January 14, 2010
Homework Set #1 (Due: Thursday, January 21, 2010) 1. Coin flips. A fair coin is flipped until the first head occurs. Let X denote the number of flips required. (a) Find the entropy H(X) in
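For a fair coin, X is geometric with P(X = k) = 2^(−k), and the entropy series evaluates to exactly 2 bits. A quick numeric check by truncating the series (60 terms suffice at double precision):

```python
from math import log2

# X = number of flips until the first head for a fair coin:
# P(X = k) = 2**(-k), k = 1, 2, ...
# H(X) = -sum_k 2**(-k) log2(2**(-k)) = sum_k k * 2**(-k) = 2 bits.
H = -sum(2**(-k) * log2(2**(-k)) for k in range(1, 61))

assert abs(H - 2.0) < 1e-9
```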
EE 376A Information Theory Prof. T. Weissman
Handout #1 Monday, January 04, 2010 NOT due
Homework Set #0 Note: HW0 has NO effect on your grade of EE376A. There is no requirement to hand it in. These are warm-up exercises in probability. However, we would li
EE 376A/Stat 376A Prof. T. Weissman
Information Theory Thursday, March 23, 2006
Final 1. (20 points) Three Shannon codes. Let {Ui}i≥1 be a stationary 1st-order Markov source whose alphabet size is r. Note that the stationarity property implies that P(u
Harvard SEAS
ES250 Information Theory
Entropy, relative entropy, and mutual information
1 Entropy
1.1 Entropy of a random variable
Definition The entropy of a discrete random variable X with pmf pX(x) is H(X) = −∑x p(x) log p(x).
The entropy measures the e
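The definition translates directly into code. A minimal sketch (the Bernoulli(1/4) pmf below is an illustrative choice):

```python
from math import log2

def entropy(pmf):
    """H(X) = -sum_x p(x) log2 p(x), in bits; 0 log 0 is taken as 0."""
    return -sum(p * log2(p) for p in pmf.values() if p > 0)

# Example: a Bernoulli(1/4) source
p = {0: 0.75, 1: 0.25}
Hp = entropy(p)

# Matches the closed form -(0.75 log2 0.75 + 0.25 log2 0.25) ≈ 0.811 bits
assert abs(Hp - (-(0.75 * log2(0.75) + 0.25 * log2(0.25)))) < 1e-12
```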
Harvard SEAS
ES250 Information Theory
Asymptotic Equipartition Property (AEP) and Entropy rates
1 Asymptotic Equipartition Property
1.1 Preliminaries
Definition (Convergence of random variables) We say that a sequence of random variables X1, X2, . . . ,
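The AEP itself is easy to see empirically: for an i.i.d. sequence, −(1/n) log2 p(X1, . . . , Xn) converges to H(X). A minimal simulation sketch (the Bernoulli parameter is an illustrative choice):

```python
import random
from math import log2

random.seed(0)
p = 0.3                                   # Bernoulli parameter (illustrative)
H = -(p * log2(p) + (1 - p) * log2(1 - p))   # true entropy ≈ 0.8813 bits

n = 200_000
xs = [1 if random.random() < p else 0 for _ in range(n)]

# Sample entropy rate: -(1/n) log2 p(X1, ..., Xn) for an i.i.d. sequence
sample_entropy = -sum(log2(p) if x else log2(1 - p) for x in xs) / n

assert abs(sample_entropy - H) < 0.02     # close to H(X), as the AEP predicts
```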
Harvard SEAS
ES250 Information Theory
Gambling and Data Compression
1 Gambling
1.1 Horse Race
Definition The wealth relative S(X) = b(X)o(X) is the factor by which the gambler's wealth grows if horse X wins the race, where b(X) is the fraction of the ga
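Over repeated races the wealth grows like 2^(nW), where W = E[log2 S(X)] is the doubling rate. A minimal simulation sketch (the two-horse probabilities and odds below are illustrative numbers):

```python
import random
from math import log2

random.seed(1)
# Two-horse race: win probabilities p, "for-one" odds o, proportional bets b = p
p = [0.6, 0.4]
o = [2.0, 2.0]
b = p[:]                                   # proportional (Kelly) betting

# Doubling rate W = sum_x p(x) log2( b(x) o(x) )
W = sum(pi * log2(bi * oi) for pi, bi, oi in zip(p, b, o))

n = 50_000
log_wealth = 0.0
for _ in range(n):
    winner = 0 if random.random() < p[0] else 1
    log_wealth += log2(b[winner] * o[winner])   # wealth multiplies by S = b(X)o(X)

# The empirical growth exponent matches the doubling rate
assert abs(log_wealth / n - W) < 0.02
```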
Harvard SEAS
ES250 Information Theory
Channel Capacity
1 Preliminaries and Definitions
1.1 Preliminaries and Examples
Communication between A (the sender) and B (the receiver) is successful when both A and B agree on the content of the message. A communicat
Harvard SEAS
ES250 Information Theory
Differential Entropy and Maximum Entropy
1 Differential Entropy
1.1 Definitions
Definition The differential entropy h(X) of a continuous random variable X with density f(x) is defined as h(X) = −∫S f(x) log f(x) dx, where S is
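For X ~ Uniform(0, a) the integral gives h(X) = log2(a) bits, which can be confirmed by a direct Riemann-sum evaluation of −∫ f log2 f. A minimal sketch (a = 4 is an illustrative choice):

```python
from math import log2

# Differential entropy of X ~ Uniform(0, a): h(X) = log2(a) bits.
a = 4.0
f = 1.0 / a            # constant density on (0, a)
N = 100_000
dx = a / N

# Riemann sum of -f(x) log2 f(x) over (0, a); the integrand is constant here
h = -sum(f * log2(f) * dx for _ in range(N))

assert abs(h - log2(a)) < 1e-6   # log2(4) = 2 bits
```

Unlike discrete entropy, h(X) can be negative: shrinking a below 1 makes log2(a) < 0.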
Harvard SEAS
ES250 Information Theory
Gaussian Channel
1 Definitions
Definition (Gaussian channel) Discrete-time channel with input Xi, noise Zi, and output Yi at time i. This is Yi = Xi + Zi, where the noise Zi is drawn i.i.d. from N(0, N) and assumed t
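Under a power constraint E[X²] ≤ P, this channel's capacity is the well-known C = (1/2) log2(1 + P/N) bits per use. A minimal sketch (the numbers P = 3, N = 1 are an illustrative choice that makes C come out to exactly 1 bit):

```python
from math import log2

def gaussian_capacity(P, N):
    """Capacity of the AWGN channel Y = X + Z, Z ~ N(0, N),
    with power constraint E[X^2] <= P: C = (1/2) log2(1 + P/N) bits/use."""
    return 0.5 * log2(1 + P / N)

# SNR of 3 (i.e. P/N = 3) gives exactly 1 bit per channel use
assert abs(gaussian_capacity(3.0, 1.0) - 1.0) < 1e-12
```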
Harvard SEAS
ES250 Information Theory
Network Information Theory
1 Gaussian Multiple-User Channels
Definition The basic discrete-time AWGN channel with input power P and noise variance N is modeled by Yi = Xi + Zi, i = 1, 2, . . . , where i is the time index an
Harvard SEAS
ES250 Information Theory
ES250 Course Information
Fall 2007-08
Hours and location
Lectures TTh 12:30pm, Cruft 318
Teaching staff
Instructor Mai Vu [email protected] 617-496-2942 Maxwell-Dworkin 342 Office Hours: Fridays 1:30-3:00pm (or by app
Harvard SEAS
ES250 Information Theory
Homework 1 (Due Date: Oct. 2 2007)
1. Let p(x, y) be given by

   x \ y |  0     1
   ------+-----------
     0   | 1/3    0
     1   | 1/3   1/3
Evaluate the following expressions: (a) H(X), H(Y) (b) H(X|Y), H(Y|X) (c) H(X, Y) (d) H(Y) − H(Y|X) (e) I(X; Y)
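The quantities in (a)–(e) for the pmf above can all be computed from the joint distribution and its marginals. A minimal sketch:

```python
from math import log2

# The joint pmf from problem 1 (keys are (x, y) pairs)
p = {(0, 0): 1/3, (0, 1): 0.0, (1, 0): 1/3, (1, 1): 1/3}

def H(pm):
    return -sum(pr * log2(pr) for pr in pm.values() if pr > 0)

pX = {x: sum(pr for (a, _), pr in p.items() if a == x) for x in (0, 1)}
pY = {y: sum(pr for (_, b), pr in p.items() if b == y) for y in (0, 1)}

H_XY = H(p)                    # (c): three equiprobable cells -> log2(3)
I = H(pX) + H(pY) - H_XY       # (e): I(X;Y) = H(X) + H(Y) - H(X,Y)

assert abs(H_XY - log2(3)) < 1e-12
assert abs(H(pX) - H(pY)) < 1e-12   # both marginals are (1/3, 2/3): ≈ 0.918 bits
assert I > 0                        # note (d) equals (e) by definition
```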
Harvard SEAS
ES250 Information Theory
Homework 2 (Due Date: Oct. 16 2007)
1. An n-dimensional rectangular box with sides X1, X2, . . . , Xn is to be constructed. The volume is Vn = ∏ni=1 Xi. The edge-length l of an n-cube with the same volume as the random b
Harvard SEAS
ES250 Information Theory
Homework 2 Solutions
1. An n-dimensional rectangular box with sides X1, X2, . . . , Xn is to be constructed. The volume is Vn = ∏ni=1 Xi. The edge-length l of an n-cube with the same volume as the random box is l = Vn^(1/n).
Harvard SEAS
ES250 Information Theory
Homework 3 (Due Date: Oct. 25 2007)
1. Let X ∼ p(x), x = 1, 2, . . . , m, denote the winner of a horse race. Suppose the odds o(x) are fair with respect to p(x), i.e., o(x) = 1/p(x). Let b(x) be the amount bet on horse x, b(x
Harvard SEAS
ES250 Information Theory
Homework 3 Solutions
1. Let X ∼ p(x), x = 1, 2, . . . , m, denote the winner of a horse race. Suppose the odds o(x) are fair with respect to p(x), i.e., o(x) = 1/p(x). Let b(x) be the amount bet on horse x, b(x) ≥ 0, ∑mx=1 b(x) = 1
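With fair odds o(x) = 1/p(x), the doubling rate becomes W(b) = ∑x p(x) log2(b(x)/p(x)) = −D(p‖b) ≤ 0, with equality iff b = p. A minimal numeric sketch (the three-horse distribution is an illustrative choice):

```python
from math import log2

# Fair odds o(x) = 1/p(x), so W(b) = sum_x p(x) log2(b(x) o(x)) = -D(p || b)
p = [0.5, 0.25, 0.25]          # illustrative win probabilities
o = [1 / px for px in p]       # fair odds

def W(b):
    """Doubling rate of the betting scheme b under fair odds."""
    return sum(px * log2(bx * ox) for px, bx, ox in zip(p, b, o))

assert abs(W(p)) < 1e-12           # b = p: wealth neither grows nor shrinks
assert W([1/3, 1/3, 1/3]) < 0      # any other bet loses money in the long run
```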
Harvard SEAS
ES250 Information Theory
Homework 4 (Due Date: Nov. 13, 2007)
1. Consider a binary symmetric channel with Yi = Xi ⊕ Zi, where ⊕ is mod-2 addition, and Xi, Yi ∈ {0, 1}. Suppose that {Zi} has constant marginal probabilities Pr(Zi = 1) = p and
Harvard SEAS
ES250 Information Theory
Homework 6 (Due Date: Jan. 8 2007)
1. Consider the ordinary additive noise Gaussian channel with two correlated looks at X, i.e., Y = (Y1, Y2), where Y1 = X + Z1, Y2 = X + Z2, with a power constraint P on X, and (
Harvard SEAS
ES250 Information Theory
Homework 5 (Due Date: Nov. 20 2007)
1. Suppose that (X, Y, Z) are jointly Gaussian and that X → Y → Z forms a Markov chain. Let X and Y have correlation coefficient ρ1 and let Y and Z have correlation coefficient ρ2. Find I(X
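For bivariate Gaussians with correlation ρ, the mutual information is −(1/2) log2(1 − ρ²), and the Markov chain makes corr(X, Z) = ρ1ρ2. A minimal sketch that also checks consistency with data processing (the values of ρ1, ρ2 are illustrative):

```python
from math import log2

def gaussian_mi(rho):
    """I between two jointly Gaussian variables with correlation rho, in bits."""
    return -0.5 * log2(1 - rho ** 2)

# X -> Y -> Z jointly Gaussian: corr(X, Z) = rho1 * rho2
rho1, rho2 = 0.9, 0.8          # illustrative correlation coefficients
I_XZ = gaussian_mi(rho1 * rho2)

assert I_XZ >= 0
assert I_XZ <= gaussian_mi(rho1) + 1e-12   # data processing: I(X;Z) <= I(X;Y)
```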
Harvard SEAS
ES250 Information Theory
Homework 4 Solutions
1. Find the channel capacity of the following discrete memoryless channel:

Y = X + Z

(Figure: the noise Z is added to the input X to produce the output Y.)

where Pr{Z = 0} = Pr{Z = a} = 1/2. The alphabet for x is X = {0, 1}. Assume that Z is independent of X. Obs
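The capacity depends on a: when the output sets for x = 0 and x = 1 are disjoint the input is recoverable and C = 1 bit, while for a = 1 the outputs overlap. A Blahut-Arimoto sketch (the algorithm is a standard capacity computation, not part of this exam; the two transition matrices below model a = 1 and a = 2):

```python
from math import log, exp, log2

def ba_capacity(P, iters=500):
    """Blahut-Arimoto capacity (bits/use) of a DMC with P[x][y] = p(y|x)."""
    nx, ny = len(P), len(P[0])
    r = [1.0 / nx] * nx                       # input-distribution iterate
    for _ in range(iters):
        py = [sum(r[x] * P[x][y] for x in range(nx)) for y in range(ny)]
        # Update: r(x) proportional to r(x) * exp( D( p(.|x) || p_Y ) )
        new = []
        for x in range(nx):
            e = sum(P[x][y] * log(P[x][y] / py[y])
                    for y in range(ny) if P[x][y] > 0)
            new.append(r[x] * exp(e))
        s = sum(new)
        r = [v / s for v in new]
    # Mutual information I(X; Y) at the optimizing input distribution
    py = [sum(r[x] * P[x][y] for x in range(nx)) for y in range(ny)]
    return sum(r[x] * P[x][y] * log2(P[x][y] / py[y])
               for x in range(nx) for y in range(ny) if P[x][y] > 0)

# Y = X + Z with Z uniform on {0, a}, X in {0, 1}:
# a = 1: outputs {0, 1, 2} overlap at y = 1, giving C = 1/2 bit;
# a = 2: output sets {0, 2} and {1, 3} are disjoint, giving C = 1 bit.
P_a1 = [[0.5, 0.5, 0.0],
        [0.0, 0.5, 0.5]]
P_a2 = [[0.5, 0.0, 0.5, 0.0],
        [0.0, 0.5, 0.0, 0.5]]

assert abs(ba_capacity(P_a1) - 0.5) < 1e-6
assert abs(ba_capacity(P_a2) - 1.0) < 1e-6
```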
SySc 645 Information Theory: Final Exam. Solutions
1. (30 points) A Hamming code with block length 7 can be characterized by the matrix

        1 0 1 0 1 0 1
    H = 0 1 1 0 0 1 1
        0 0 0 1 1 1 1

Each legal input codeword x satisfies Hx = 0 (all arithmetic is understood to be mod 2
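The condition Hx = 0 also gives the decoder: the columns of this H are the binary numbers 1 through 7, so the syndrome of a single-error pattern reads off the error position directly. A minimal syndrome-decoding sketch (the codeword below is one illustrative solution of Hx = 0):

```python
# Parity-check matrix of the (7,4) Hamming code, as reconstructed above;
# column i (1-indexed) is the binary representation of i.
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]

def syndrome(x):
    """Hx mod 2; the all-zero vector iff x is a codeword."""
    return [sum(h * xi for h, xi in zip(row, x)) % 2 for row in H]

codeword = [1, 1, 1, 0, 0, 0, 0]          # satisfies Hx = 0
assert syndrome(codeword) == [0, 0, 0]

# A single bit error at position i makes the syndrome equal column i of H,
# i.e. the binary representation of i, so one error is always correctable.
received = codeword[:]
received[4] ^= 1                          # flip bit 5 (index 4)
s = syndrome(received)
err_pos = s[0] + 2 * s[1] + 4 * s[2]      # read the syndrome as binary

assert err_pos == 5                       # points at the flipped bit
```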
Information Theory and Coding Prof. Suhas Diggavi
EPFL Winter Semester 2009/2010 Handout # 23, Thursday, 17 December, 2009
Solutions: Homework Set # 6
Problem 1
(Cascade Network)
(a) We know that the capacity of the channel is equal to C = max_{p(x)} I(X; V).
Solutions to Homework Set #4: Channel and Source Coding. 1. Lossless source coding with side information. Consider lossless source coding with side information that is available at the encoder and decoder, where the source X and the side information Y a
IST 1 Introduction to Information Prof. M. Effros Homework Set #6 Due: Wednesday, May 16
Handout #35 Wednesday, 5/9/7
1. Determine whether each of the following channels is weakly symmetric. Find the capacity of each weakly symmetric channel. (a) For 0 <