Lecture 9 Outline: Source Encoding and Huffman Coding

Reading: Section 6.3.1 and part of Section 6.1 (pages 267-271).

Material covered:

1. What is information?

   Example: M = {It rains today in Phoenix, It does not rain today in Phoenix}. The fact that it rains in Phoenix on fewer than 30 days of the year implies there is not much information in this statement (<< 1 bit).

   Self-information of a message: if message m occurs with probability $p_m$, then its self-information is $-\log_2 p_m$ bits.

   Source information (source entropy) is the self-information averaged over all messages:
   $H(M) = -\sum_m p_m \log_2 p_m$

   Example: entropy of "being rainy" in Phoenix:
   $H(M) = -\tfrac{335}{365}\log_2\tfrac{335}{365} - \tfrac{30}{365}\log_2\tfrac{30}{365} \approx 0.41$ bit

   Example: entropy of English. Let M be the English alphabet. If English letters were equally likely and memoryless (they are not!), then
   $H(M) = -\sum_m p_m \log_2 p_m = -26 \cdot \tfrac{1}{26}\log_2\tfrac{1}{26} = \log_2 26 \approx 4.7$ bits per letter.
   The actual entropy of English is about 2.6 bits per letter, found by a guessing game devised by Shannon.
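   As a quick numerical check of the two examples above, here is a minimal sketch (not part of the lecture notes; plain Python using only the standard library, with illustrative function names):

```python
import math

def self_information(p):
    # Self-information of a message with probability p, in bits: -log2(p).
    return -math.log2(p)

def entropy(probs):
    # Source entropy H(M) = -sum_m p_m log2 p_m, in bits per message.
    # Messages with zero probability contribute nothing, so they are skipped.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# "Being rainy" in Phoenix: roughly 30 rainy days out of 365.
p_rain = 30 / 365
print(self_information(p_rain))       # ~3.6 bits: the rarer message carries more information
print(entropy([p_rain, 1 - p_rain]))  # ~0.41 bit for the two-message source

# 26 equally likely, memoryless English letters.
print(entropy([1 / 26] * 26))         # log2(26) ~ 4.7 bits per letter
```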