Introduction to Information Theory (67548)
Assignment 4: Gaussian Channel and Differential Entropy
Lecturer: Prof. Michael Werman
January 12, 2009. Due: Sunday, January 25, 2009.
Note: Unless specified otherwise, all entropies and logarithms should be taken with base 2.

Problem 1: Differential Entropy

1. If Y = aX + c, we have that f_Y(y) = \frac{1}{|a|} f_X\left(\frac{y-c}{a}\right), where f_X(\cdot), f_Y(\cdot) are the density functions of X and Y respectively. Therefore,

    h(Y) = - \int_{-\infty}^{\infty} f_Y(y) \log f_Y(y) \, dy
         = - \int_{-\infty}^{\infty} \frac{1}{|a|} f_X\!\left(\frac{y-c}{a}\right) \log\!\left(\frac{1}{|a|} f_X\!\left(\frac{y-c}{a}\right)\right) dy
         = - \int_{-\infty}^{\infty} \frac{1}{|a|} f_X\!\left(\frac{y-c}{a}\right) \log f_X\!\left(\frac{y-c}{a}\right) dy
           - \int_{-\infty}^{\infty} \frac{1}{|a|} f_X\!\left(\frac{y-c}{a}\right) \log\frac{1}{|a|} \, dy
         = - \int_{-\infty}^{\infty} f_X(z) \log f_X(z) \, dz - \int_{-\infty}^{\infty} f_X(z) \log\frac{1}{|a|} \, dz
         = h(X) + \log|a|,

where the step from the third to the fourth line uses the change of variables z = (y-c)/a. So we see that shifting a random variable X by a constant c does not change its differential entropy, while multiplying by a constant a changes its differential entropy by \log|a|. (A short numerical sanity check of this scaling law is sketched at the end of this problem.)

2. The chain rule for differential entropy, h(X_1, \ldots, X_n) = \sum_{i=1}^{n} h(X_i \mid X_1, \ldots, X_{i-1}), simply follows from the definitions (try it!), and so does the resulting conclusion.
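As a quick numerical check of the scaling law in part 1 (not part of the original solution), the following Python sketch uses the closed-form differential entropy of a Gaussian, h(N(\mu, \sigma^2)) = \frac{1}{2}\log_2(2\pi e \sigma^2) bits; the specific values of a, c and sigma below are arbitrary illustrative choices.

    import math

    def gaussian_entropy_bits(sigma):
        """Differential entropy, in bits, of a Gaussian with standard deviation sigma (independent of its mean)."""
        return 0.5 * math.log2(2.0 * math.pi * math.e * sigma ** 2)

    # Illustrative values: Y = a*X + c with X ~ N(0, sigma^2).
    a, c, sigma = -3.0, 5.0, 2.0

    h_X = gaussian_entropy_bits(sigma)
    h_Y = gaussian_entropy_bits(abs(a) * sigma)   # a*X + c ~ N(c, a^2 * sigma^2); c does not affect h

    print(f"h(Y)             = {h_Y:.6f} bits")
    print(f"h(X) + log2(|a|) = {h_X + math.log2(abs(a)):.6f} bits")

The two printed values coincide, and changing c alone leaves them unchanged, consistent with h(aX + c) = h(X) + \log|a|.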
Problem 2
