# Introduction to Information Theory


## Common Sense Measure of Information

Intuitively, the less likely a message is, the more information it carries:

- "Tomorrow the sun will rise in the east." (practically certain, so it conveys almost no information)
- "The United States invades Cuba." (unlikely, so it conveys more information)
- "Cuba invades the United States." (very unlikely, so it conveys even more information)

## Information and Probability

- When P → 1, then I → 0.
- When P → 0, then I → infinity.
- So I ~ log(1/P), since a smaller probability gives a bigger I.

## Engineering Measure of Information

- Sending two equiprobable messages m1 and m2: use the single binary digits 0 and 1.
- Sending four equiprobable messages m1, m2, m3, m4: a minimum of two binary digits per message is required: use 00, 01, 10, 11 (twice as much transmission time as in the first case).
- Sending eight equiprobable messages m1, m2, …, m8: a minimum of three binary digits per message is required: use 000, 001, …, 111.

In general, sending one of n equiprobable messages requires log2(n) binary digits per message. Since each equiprobable message has probability p = 1/n, this is log2(n) = log2(1/p) bits per message. Therefore, from the engineering point of view, the information contained in a message is proportional to log2(1/p).
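The two measures above agree on the same formula, I = log2(1/p). As a minimal sketch (not part of the original notes; the function name `information_bits` is my own), the following checks that n equiprobable messages need log2(n) bits each and that rarer messages carry more information:

```python
import math

def information_bits(p):
    """Information content in bits of a message with probability p: I = log2(1/p)."""
    return math.log2(1 / p)

# n equiprobable messages: each has p = 1/n, so I = log2(n) bits per message.
for n in (2, 4, 8):
    print(n, "messages ->", information_bits(1 / n), "bits each")
    # 2 -> 1.0, 4 -> 2.0, 8 -> 3.0, matching the 0/1, 00..11, 000..111 codes above.

# A certain event (p = 1) carries no information.
print(information_bits(1.0))  # 0.0

# Smaller probability gives bigger I, matching the common-sense measure.
assert information_bits(0.1) > information_bits(0.5)
```

Note that for a certain event (p → 1) the value tends to 0, and as p → 0 it grows without bound, exactly the two limiting cases listed under "Information and Probability".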

This note was uploaded on 12/18/2010 for the course ME 22 taught by Professor Rashid during the Spring '10 term at Superior University Lahore.

