Suppose that X = {1, 2, 3, 4} and that the probabilities of the four possible outcomes are p = {1/2, 1/8, 1/4, 1/8}.

(a) (2 points) Determine H(X).

(b) (2 points) Let q = {1/8, 1/4, 1/2, 1/8} be the probabilities associated with a random variable Y, also defined on the set {1, 2, 3, 4}. Compute H(Y).

(c) (2 points) Find the relative entropy between p and q, i.e., D(p‖q). Also find D(q‖p).

(d) (6 points) Find a Huffman code for X.

(e) (2 points) Find the expected codeword length for the Huffman code.

(f) (6 points) Now suppose that q had been the true distribution, but the Huffman code was designed using p as in part (d). Find the expected codeword length. What is the cost of not using the true distribution q to design the code?

Problem 5: Solve question 4.28 from Chapter 4 of the textbook (second edition).
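The quantities asked for in parts (a)-(f) can all be checked numerically. The sketch below (not part of the assignment, just an illustrative aid) computes entropy, relative entropy, and binary Huffman codeword lengths for the two distributions; the function names `H`, `D`, and `huffman_lengths` are my own choices, not from the problem set.

```python
import heapq
from math import log2

p = [1/2, 1/8, 1/4, 1/8]  # distribution of X over {1, 2, 3, 4}
q = [1/8, 1/4, 1/2, 1/8]  # distribution of Y over the same set

def H(dist):
    """Shannon entropy in bits: H = -sum p_i log2 p_i."""
    return -sum(pi * log2(pi) for pi in dist if pi > 0)

def D(a, b):
    """Relative entropy D(a || b) in bits: sum a_i log2(a_i / b_i)."""
    return sum(ai * log2(ai / bi) for ai, bi in zip(a, b) if ai > 0)

def huffman_lengths(dist):
    """Codeword lengths of a binary Huffman code for dist.

    Repeatedly merge the two least-probable subtrees; every symbol
    inside a merged subtree gains one bit of codeword length.
    """
    heap = [(pi, [i]) for i, pi in enumerate(dist)]
    heapq.heapify(heap)
    lengths = [0] * len(dist)
    while len(heap) > 1:
        p1, syms1 = heapq.heappop(heap)
        p2, syms2 = heapq.heappop(heap)
        for i in syms1 + syms2:
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, syms1 + syms2))
    return lengths

lengths = huffman_lengths(p)
avg_len_p = sum(pi * li for pi, li in zip(p, lengths))  # part (e)
avg_len_q = sum(qi * li for qi, li in zip(q, lengths))  # part (f)
```

Because p is dyadic (all probabilities are powers of 1/2), the Huffman code for p achieves the entropy exactly, and the penalty in part (f) for coding q with p's code equals D(q‖p), consistent with the standard mismatch result E_q[L] = H(q) + D(q‖p) for a dyadic design distribution.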
GEORGIA INSTITUTE OF TECHNOLOGY, School of Electrical and Computer Engineering
ECE Information Theory, Fall '08 (Staff)
