S08.hw04 - ECE 320 Networks and Systems, Spring 2007–2008

ECE 320 Networks and Systems, Spring 2007–2008
Problem Set 4
Due March 3, 2008

1. This problem considers source encoding for a source that generates 4 different symbols.

   (a) The 4 symbols and their probabilities are shown in the following table.

           s_i    Pr(s_i)
           e      .60
           b      .29
           y      .07
           !      .04

       Determine the Huffman code. What is the expected number of binary code symbols per 4-ary source symbol? (A verification sketch appears after the problem statements below.)

   (b) If the probabilities in Problem 1a were different, the following Huffman code might result:

           s_i    code word
           e      111
           b      110
           y      10
           !      0

       (This code is not the answer to Problem 1a.) Using this code, decode the message 110101110. (A decoding sketch also appears below.)

2. The entropy H_d of a probability mass function (p_k) is defined by

       H_d(p) = -\sum_{k=1}^{n} p_k \log_d p_k,

   where \log_d is the logarithm to base d. Because the entropy describes the average length of a code word needed to encode realizations from the pmf p, it is important to determine the most unfavorable pmf p, i.e., the pmf that makes H_d(p) as large as possible. (An exploratory sketch appears below.)
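The Huffman construction asked for in Problem 1(a) can be cross-checked numerically. Below is a minimal Python sketch, not part of the original assignment; the function name huffman_code and the tie-breaking order are choices of this sketch, so a hand derivation may assign complementary bits at some merges, but the expected codeword length comes out the same.

    import heapq

    def huffman_code(probs):
        """Build a binary Huffman code for a dict {symbol: probability}.
        Returns {symbol: codeword string}."""
        # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        counter = len(heap)
        while len(heap) > 1:
            p0, _, code0 = heapq.heappop(heap)   # two least probable subtrees
            p1, _, code1 = heapq.heappop(heap)
            # Prepend one bit to every codeword in each merged subtree.
            merged = {s: "0" + c for s, c in code0.items()}
            merged.update({s: "1" + c for s, c in code1.items()})
            heapq.heappush(heap, (p0 + p1, counter, merged))
            counter += 1
        return heap[0][2]

    # Probabilities from Problem 1(a).
    probs = {"e": 0.60, "b": 0.29, "y": 0.07, "!": 0.04}
    code = huffman_code(probs)
    avg_len = sum(probs[s] * len(code[s]) for s in probs)
    print(code)
    print("expected binary code symbols per source symbol:", avg_len)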
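For Problem 1(b), hand decoding works by scanning the bit string from the left and emitting a symbol each time the accumulated bits match a codeword; the prefix property guarantees the match is unambiguous. The sketch below implements that procedure (the helper name decode is an assumption of this write-up) and can be used to check the hand result.

    def decode(bits, code):
        """Decode a bit string using a prefix code given as {symbol: codeword}."""
        inverse = {cw: s for s, cw in code.items()}
        decoded, buffer = [], ""
        for bit in bits:
            buffer += bit
            if buffer in inverse:      # prefix property: first match is the symbol
                decoded.append(inverse[buffer])
                buffer = ""
        return "".join(decoded)

    # Code and message from Problem 1(b).
    code_1b = {"e": "111", "b": "110", "y": "10", "!": "0"}
    print(decode("110101110", code_1b))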

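For Problem 2, it can help to evaluate H_d(p) numerically before attempting a proof. The sketch below is only an exploratory aid, not part of the original problem set: it implements the definition of H_d(p) for d = 2 and compares a few pmfs on four symbols, which hints at the shape of the pmf that makes H_d(p) as large as possible.

    import math

    def entropy(p, d=2):
        """H_d(p) = -sum over k of p_k * log_d(p_k), with 0 * log(0) taken as 0."""
        return -sum(pk * math.log(pk, d) for pk in p if pk > 0)

    # Entropy (in bits, d = 2) of the pmf from Problem 1(a).
    print(entropy([0.60, 0.29, 0.07, 0.04]))

    # Comparing progressively "flatter" pmfs on 4 symbols suggests which pmf
    # makes H_d(p) as large as possible.
    for p in ([0.97, 0.01, 0.01, 0.01],
              [0.40, 0.30, 0.20, 0.10],
              [0.25, 0.25, 0.25, 0.25]):
        print(p, entropy(p))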