Introduction to Information Theory (67548)                         January 12, 2009

Assignment 4: Gaussian Channel and Differential Entropy

Lecturer: Prof. Michael Werman                         Due: Sunday, January 25, 2009

Note: Unless specified otherwise, all entropies and logarithms should be taken with base 2.

Problem 1  Differential Entropy

1. Let X be a continuous random variable with differential entropy h(X). Let Y be another continuous random variable, defined via the relation Y = aX + c, where a, c are scalars and a ≠ 0. Find an expression for h(Y) as a function of h(X), a, c.

2. Prove the chain rule for differential entropy: if X_1, ..., X_n are continuous random variables, then

   h(X_1, ..., X_n) = \sum_{i=1}^{n} h(X_i | X_1, ..., X_{i-1}).

   Conclude that h(X_1, ..., X_n) ≤ \sum_{i=1}^{n} h(X_i).

Problem 2  Discrete Input, Continuous Output

Consider a channel whose input alphabet is X = {0, ±1, ±2}, and whose output is Y = X + Z, where Z is uniformly distributed over the interval [-1, 1]. Thus, the input of the channel is a discrete random variable, while its output is continuous.
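As a numerical sanity check for Problem 1.1 (an illustration only, not a proof; the Gaussian input is an assumption made here purely for convenience), the closed form h(X) = (1/2) log2(2πeσ²) for X ~ N(μ, σ²) lets one compare h(X) with h(aX + c) for concrete values of a and c:

```python
import math

def gaussian_diff_entropy_bits(sigma2):
    """Differential entropy, in bits, of a Gaussian with variance sigma2: 0.5 * log2(2*pi*e*sigma2)."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma2)

# Assumed example values: X ~ N(0, 1), Y = a*X + c with a = 3, c = 5 (chosen arbitrarily).
a, c = 3.0, 5.0
h_X = gaussian_diff_entropy_bits(1.0)            # h(X) for unit variance
h_Y = gaussian_diff_entropy_bits(a ** 2 * 1.0)   # Y ~ N(c, a^2 * 1): only the variance changes
print(f"h(X)        = {h_X:.4f} bits")
print(f"h(Y)        = {h_Y:.4f} bits")
print(f"h(Y) - h(X) = {h_Y - h_X:.4f} bits   (log2|a| = {math.log2(abs(a)):.4f} bits)")
```

Note that the shift c never enters the comparison, reflecting the fact that translating a density leaves its differential entropy unchanged.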
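For Problem 2, a minimal simulation sketch of the channel (the uniform input distribution over {0, ±1, ±2} is an assumption made here for illustration and is not claimed to be optimal) shows how a discrete input combined with Z ~ Uniform[-1, 1] produces a continuous output supported on [-3, 3]:

```python
import numpy as np

rng = np.random.default_rng(0)
symbols = np.array([-2, -1, 0, 1, 2])

n = 100_000
X = rng.choice(symbols, size=n)        # assumed uniform input over the alphabet, for illustration
Z = rng.uniform(-1.0, 1.0, size=n)     # noise Z ~ Uniform[-1, 1]
Y = X + Z                              # channel output: a continuous variable on [-3, 3]

# Empirical output density on half-unit bins (a crude stand-in for p_Y).
density, edges = np.histogram(Y, bins=12, range=(-3.0, 3.0), density=True)
for lo, hi, p in zip(edges[:-1], edges[1:], density):
    print(f"p_Y on [{lo:+.1f}, {hi:+.1f}) ~ {p:.3f}")
```

With this assumed input, the empirical density is roughly twice as high on (-2, 2), where each output value is reachable from two adjacent input symbols, as on the tails (-3, -2) and (2, 3), which are reachable from only one.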
