ex2 - Introduction to Information Theory (67548), December 5, 2008
Introduction to Information Theory (67548)
Assignment 2
Lecturer: Prof. Michael Werman
December 5, 2008
Due: Sunday, Dec. 21, 2008

Note: Unless specified otherwise, all entropies and logarithms should be taken with base 2.

Problem 1: AEP and Source Coding

A discrete memoryless source emits a sequence of statistically independent letters (either 'x' or 'y') with probabilities p(x) = 0.005 and p(y) = 0.995. The letters are taken 100 at a time, and a binary codeword (composed of 0's and 1's) is provided for every sequence of 100 letters containing three or fewer x's.

1. Assuming that all codewords have the same length, find the minimum length required to provide codewords for all sequences with three or fewer x's.
2. Calculate the probability of observing a source sequence for which no codeword has been assigned.
3. Use Chebyshev's inequality to bound the probability of observing a source sequence for which no codeword has been assigned. Compare this bound with the actual probability computed in part 2.
4. Suppose now that instead of coding the letters 100 at a time, we code each letter separately using a binary code. ...
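The quantities in parts 1-3 can be checked numerically. The sketch below (an illustrative aid, not the analytical derivation the assignment expects) counts the sequences of 100 letters with three or fewer x's, derives the fixed codeword length from that count, evaluates the exact binomial probability that a block receives no codeword, and computes the Chebyshev bound on the same event.

```python
from math import ceil, comb, log2

n, p = 100, 0.005  # block length and P('x') from the problem statement

# Part 1: number of sequences with three or fewer x's, and the
# fixed codeword length needed to index all of them.
num_coded = sum(comb(n, k) for k in range(4))
min_len = ceil(log2(num_coded))

# Part 2: probability that a block contains four or more x's,
# i.e. receives no codeword, from the Binomial(100, 0.005) law.
p_coded = sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(4))
p_uncoded = 1 - p_coded

# Part 3: Chebyshev bound on the same event. The number of x's has
# mean np = 0.5 and variance np(1-p) = 0.4975, and "four or more x's"
# implies the deviation |X - 0.5| is at least 3.5.
mean, var = n * p, n * p * (1 - p)
chebyshev_bound = var / 3.5**2

print(f"coded sequences:   {num_coded}")
print(f"codeword length:   {min_len} bits")
print(f"P(no codeword):    {p_uncoded:.6f}")
print(f"Chebyshev bound:   {chebyshev_bound:.6f}")
```

The Chebyshev bound (about 0.04) is much looser than the exact tail probability, which is the comparison part 3 asks for.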