# Lecture - Huffman Coding


## EE 4780 Huffman Coding Example

(Slides: Bahadir K. Gunturk)

Suppose X is a source producing symbols drawn from the alphabet A = {a1, a2, a3, a4, a5}, with symbol probabilities {0.4, 0.2, 0.2, 0.15, 0.05}.

Form the Huffman tree by repeatedly merging the two least-probable nodes: a4 and a5 merge into a node of weight 0.2, which merges with a3 into 0.4, then with a2 into 0.6, and finally with a1 into the root of weight 1.0. Assigning 0 and 1 to the two branches at each merge yields the codewords:

| Symbol | Probability | Codeword |
|--------|-------------|----------|
| a1     | 0.4         | 0        |
| a2     | 0.2         | 10       |
| a3     | 0.2         | 110      |
| a4     | 0.15        | 1110     |
| a5     | 0.05        | 1111     |

Average codeword length = 0.4\*1 + 0.2\*2 + 0.2\*3 + 0.15\*4 + 0.05\*4 = 2.2 bits per symbol.
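The construction above can be sketched in a few lines of code. The following is a minimal illustration (not part of the lecture): it builds a Huffman code with a binary heap and checks the 2.2-bit average length. The `huffman_code` helper and its exact tie-breaking are my own choices; ties can merge in a different order than the slide and still give the same average length.

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Build a Huffman code for {symbol: probability}; returns {symbol: codeword}."""
    tiebreak = count()
    # Heap entries: (subtree probability, unique tiebreaker, {symbol: partial codeword}).
    heap = [(p, next(tiebreak), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        # Pop the two least-probable subtrees and merge them.
        p_lo, _, code_lo = heapq.heappop(heap)
        p_hi, _, code_hi = heapq.heappop(heap)
        # Prepend 0 to one subtree's codewords and 1 to the other's.
        merged = {s: "0" + c for s, c in code_lo.items()}
        merged.update({s: "1" + c for s, c in code_hi.items()})
        heapq.heappush(heap, (p_lo + p_hi, next(tiebreak), merged))
    return heap[0][2]

probs = {"a1": 0.4, "a2": 0.2, "a3": 0.2, "a4": 0.15, "a5": 0.05}
code = huffman_code(probs)
avg_len = sum(probs[s] * len(code[s]) for s in probs)
print(code)
print(avg_len)  # 2.2 bits per symbol (up to float rounding), regardless of tie-breaking
```

Whatever order ties are merged in, the resulting code is prefix-free and achieves the same optimal average length.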

The per-symbol entropy of this source is

H(X) = Σᵢ P(xᵢ) log₂(1 / P(xᵢ)) ≈ 2.08 bits per symbol,

so the 2.2-bit average codeword length is close to the theoretical minimum.

## Huffman Coding Example (Another Possible Tree)

(Slides: Bahadir K. Gunturk)

Another possible Huffman tree for the same source merges the tied weights in a different order, again producing intermediate node weights 0.2, 0.4, 0.6, 1.0:

| Symbol | Probability | Codeword |
|--------|-------------|----------|
| a1     | 0.4         | 0        |
| a2     | 0.2         | 100      |
| a3     | 0.2         | 101      |
| a4     | 0.15        | 110      |
| a5     | 0.05        | 111      |

Average codeword length = 0.4\*1 + 0.2\*3 + 0.2\*3 + 0.15\*3 + 0.05\*3 = 2.2 bits per symbol, the same as before: any Huffman code for a given source achieves the same optimal average length.
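As a quick numerical check on the figures above, a short sketch (my own, not part of the lecture) reproduces both the entropy and the second tree's average codeword length:

```python
import math

# Source probabilities from the example.
probs = [0.4, 0.2, 0.2, 0.15, 0.05]

# Entropy H(X) = sum over i of P(x_i) * log2(1 / P(x_i)).
entropy = sum(p * math.log2(1 / p) for p in probs)

# Codeword lengths from the second tree: {1, 3, 3, 3, 3}.
lengths = [1, 3, 3, 3, 3]
avg_len = sum(p * n for p, n in zip(probs, lengths))

# Entropy rounds to 2.08; average length is 2.2 (up to float rounding).
print(round(entropy, 2), avg_len)
```

The entropy lower-bounds the average length of any prefix code, and Huffman coding is guaranteed to come within 1 bit of it, consistent with 2.08 ≤ 2.2 < 3.08 here.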

This note was uploaded on 11/28/2011 for the course EE 4780, taught by Professor Staff during the Spring '08 term at LSU.

