University of Illinois at Urbana-Champaign
ECE 563: Information Theory
Fall 2008

Midterm 2
Wednesday, November 19, 2008

Name:

This is a closed-book exam. You may consult both sides of two sheets of notes, typed in font size 10 or equivalent handwriting size. Calculators, laptop computers, Palm Pilots, two-way email pagers, etc. may not be used. Write your answers in the space provided. Please show all of your work. Answers without appropriate justification will receive very little credit.

Score:
1. (20 points)
2. (50 points)
3. (30 points)
Total (100 points)

1. Consider optimally compressing a source X ∈ X, where each x ∈ X is equally likely.

a) Suppose |X| = 2^k and the optimal Huffman code tree is constructed. Consider a new source Y, also equally likely, with |Y| = 2^k + 1. How do the Huffman codes differ in the two cases? What is the expected length of the latter? (Feel free to think graphically.)

b) Give the exact expected Huffman code length for the scenario where |X| = 100.

2. Consider trying to communicate a message W ∈ W = {1, ..., 2^{nR}} at rate R across a memoryless noisy channel, with feedback. We assume W is uniformly distributed over W. ...
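As a study aid (not part of the original exam), the expected Huffman code length for a uniform source can be checked numerically. The sketch below runs the standard Huffman merge procedure with `heapq`; since all symbols are equally likely, each merge step adds the combined weight to the total weighted codeword length, and dividing by the alphabet size gives the expected length. The function name `huffman_expected_length` is my own for illustration.

```python
import heapq

def huffman_expected_length(m):
    """Expected Huffman codeword length for a uniform source of size m."""
    # Give every symbol weight 1; the second tuple element breaks ties.
    heap = [(1, i) for i in range(m)]
    heapq.heapify(heap)
    total = 0          # accumulates sum of merged-node weights,
    next_id = m        # which equals the total weighted codeword length
    while len(heap) > 1:
        w1, _ = heapq.heappop(heap)
        w2, _ = heapq.heappop(heap)
        total += w1 + w2
        heapq.heappush(heap, (w1 + w2, next_id))
        next_id += 1
    return total / m

# For |X| = 2^k + 1 the result matches k + 2/(2^k + 1), e.g. k = 3:
print(huffman_expected_length(9))    # 3 + 2/9 ≈ 3.2222
print(huffman_expected_length(100))  # part (b) alphabet size
```

Because Huffman coding is optimal under any tie-breaking rule, the expected length returned here is the same regardless of merge order among equal weights.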