EE 211A Digital Image Processing I
Fall Quarter, 2010
Handout 21
Instructor: John Villasenor

Homework 7
Due: Tuesday, 30 November 2010

1. The output of a binary source is to be coded in blocks of M samples. The successive outputs are independent and identically distributed with p = 0.95 (the probability of a 0). Find the Huffman codes for M = 1, 2 and calculate their efficiencies.

2. Consider an information source that produces k possible symbols, each with equal probability.

a) Provide a formula for the entropy (in bits) of this information source. Consider k in the range 2^n ≤ k ≤ 2^(n+1). As is customary, let H denote the entropy, and let L denote the actual average length of the Huffman code for this source.

b) What is the minimum possible value of L - H, and for what value(s) of k does it occur in the range specified above?

c) Provide a formula for the efficiency of the Huffman code in terms of the variables k and n.

