Department of Electrical and Computer Engineering
The Johns Hopkins University
520.137 Introduction to Electrical and Computer Engineering
Fall 2009

Homework Assignment VII

Reading Assignment: Kuc, Chapter 7

1. Let's explore the encoding of a binary source with skewed statistics: P[1] = 1/8 and P[0] = 7/8.

   (a) What is the entropy H of this bit-stream?

   (b) What is the average code-word length for the fixed-length code in this case? What is the average code-word length if we apply the Huffman code directly?

   (c) One solution to this problem is to combine two consecutive binary symbols to form a new symbol. Find the probabilities of occurrence for the new symbols: P[00], P[01], P[10], P[11].

   (d) What is the entropy of the modified source in Part (c)? How does it compare to the original entropy?

   (e) Construct the Huffman code for the modified source of Part (c). Show the Huffman tree as well as the Huffman code table.
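Not part of the original assignment: a minimal Python sketch for checking Parts (a), (c), and (e) numerically. It assumes the bits are independent and identically distributed (so pair probabilities multiply), and the `huffman` helper below is an illustrative implementation, not one prescribed by the course.

```python
import heapq
import math
from itertools import count

# Skewed binary source from the problem: P[1] = 1/8, P[0] = 7/8.
p = {"0": 7 / 8, "1": 1 / 8}

# (a) Entropy of the original bit-stream: H = -sum_i p_i * log2(p_i).
H = -sum(q * math.log2(q) for q in p.values())

# (c) Pair-symbol probabilities, assuming independent consecutive bits.
pairs = {a + b: p[a] * p[b] for a in p for b in p}

# (e) Huffman code built with a min-heap; the counter breaks probability
# ties so heap comparisons never reach the (unorderable) dict payload.
def huffman(probs):
    tie = count()
    heap = [(q, next(tie), {sym: ""}) for sym, q in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        q1, _, c1 = heapq.heappop(heap)  # two least-probable subtrees
        q2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (q1 + q2, next(tie), merged))
    return heap[0][2]

code = huffman(pairs)
avg_len = sum(pairs[s] * len(code[s]) for s in code)

print(f"H = {H:.4f} bits/bit")
for s in sorted(pairs, key=pairs.get, reverse=True):
    print(s, f"{pairs[s]:.6f}", code[s])
print(f"avg length = {avg_len:.6f} bits/pair = {avg_len / 2:.6f} bits/bit")
```

Pairing symbols pushes the average rate (here about 0.68 bits per original bit) closer to the entropy bound of roughly 0.544 bits/bit, which is the point of Parts (c)-(e).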
This note was uploaded on 02/02/2010 for the course ENGINEERIN 520.101 taught by Professor Tracdtran during the Winter '06 term at Johns Hopkins.