Entropy_and_Source_Coding_Theorem-cse802

4/1/11 — Introduction to Information Theory
Dr. Adnan K. Kiani
In this Lecture

We will go through:
- Entropy and some related properties
- Source coding
- The Nyquist-Shannon sampling theorem
- Shannon's source coding theorem
Information Theory

"The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point."
— Claude Shannon, 1948
Information Theory

Information theory deals with the problem of efficient and reliable transmission of information. It specifically encompasses theoretical and applied aspects of:
- coding, communications, and communication networks
- complexity and cryptography
- detection and estimation
- learning, Shannon theory, and stochastic processes
Entropy of an Information Source
A Typical Communication System

Information source → Transmitter → Channel → Receiver → Information sink
Information Source

Source output: x ∈ {finite set of messages}
Example:
- Binary source: x ∈ {0, 1} with P(x = 0) = p, P(x = 1) = 1 − p
- M-ary source: x ∈ {1, 2, …, M} with Σ Pᵢ = 1
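The discrete source above can be sketched in a few lines of Python. This is an illustrative sketch, not part of the lecture; the helper name `make_source` and the example probabilities are my own.

```python
import random

# Sketch of a discrete memoryless source: it emits symbols from a finite
# alphabet according to fixed probabilities, which must sum to 1.
def make_source(symbols, probs):
    assert abs(sum(probs) - 1.0) < 1e-9, "the P_i must sum to 1"
    def emit(n):
        # Draw n independent symbols weighted by their probabilities.
        return random.choices(symbols, weights=probs, k=n)
    return emit

binary_source = make_source([0, 1], [0.3, 0.7])      # P(x=0) = p = 0.3, P(x=1) = 1 - p
mary_source = make_source([1, 2, 3, 4], [0.25] * 4)  # M = 4, uniform

print(binary_source(10))
print(mary_source(10))
```

The `assert` on the probability sum enforces the slide's Σ Pᵢ = 1 condition.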
Information Source

Discrete finite ensemble: a, b, c, d, … In general, k binary digits specify 2^k messages.
Analogue signal (the problem is sampling speed): 1) sample, and 2) represent each sample value in binary.
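Both points above can be sketched concretely: k bits distinguish 2^k messages, and an analogue sample can be represented in binary by quantizing it. The function names and the toy uniform quantizer over [−1, 1] are illustrative assumptions, not from the lecture.

```python
import math

# k binary digits specify 2**k messages, so n messages need ceil(log2(n)) bits.
def bits_needed(n):
    return math.ceil(math.log2(n))

print(bits_needed(4))  # 2 bits for the ensemble {a, b, c, d}

# Analogue signal: (1) sample, (2) represent the sample value in binary.
# Toy uniform quantizer: map x in [-1, 1] to a k-bit binary string.
def quantize(x, k):
    levels = 2 ** k
    idx = min(int((x + 1) / 2 * levels), levels - 1)  # clamp top edge to 2^k - 1
    return format(idx, f"0{k}b")

print(quantize(0.5, 3))  # '110'
```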
What is an Information Source?

An information source produces a message, or a sequence of messages, to be communicated to a destination or receiver. At a finer granularity, an information source produces symbols to be communicated to the destination. In this lecture we will mainly focus on discrete sources, i.e., sources that produce discrete symbols from a predefined alphabet.
What is an Information Source?

Intuitively, an information source having more symbols should carry more information. For instance, consider a source, say S1, that wants to communicate its direction to a destination using the following symbols: North (N), South (S), East (E), West (W). Another source, say S2, can communicate its direction using: North (N), South (S), East (E), West (W), …
Minimum Number of Bits for a Source

Before we formally define information, let us try to answer the following question: what is the minimum number of bits/symbol required to communicate an information source having n symbols?
Minimum Number of Bits for a Source

What is the minimum number of bits/symbol required to communicate an information source having n symbols? A simple answer is that log2(n) bits are required to represent n symbols:
- 2 symbols: 0, 1
- 4 symbols: 00, 01, 10, 11
- 8 symbols: 000, 001, 010, 011, 100, 101, 110, 111
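The fixed-length coding scheme above can be sketched directly: assign each of the n symbols a distinct ceil(log2(n))-bit codeword. The helper name `fixed_length_code` is illustrative, not from the lecture.

```python
import math

# Fixed-length binary code: each of n symbols gets a distinct
# ceil(log2(n))-bit codeword, matching the slide's 2/4/8-symbol examples.
def fixed_length_code(symbols):
    k = math.ceil(math.log2(len(symbols)))
    return {s: format(i, f"0{k}b") for i, s in enumerate(symbols)}

print(fixed_length_code(["N", "S", "E", "W"]))
# {'N': '00', 'S': '01', 'E': '10', 'W': '11'}
```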
Minimum Number of Bits for a Source

Let there be a source X that wants to communicate information about its direction to a destination, i.e., n = 4 symbols: North (N), South (S), East (E), West (W).
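Looking ahead to entropy, the lecture's title topic, the n = 4 direction source gives a concrete check on the log2(n) answer. The sketch below uses the standard entropy formula H(X) = −Σ pᵢ log2(pᵢ); the skewed probability values are my own illustration, not from the slides.

```python
import math

# Entropy H(X) = -sum p_i * log2(p_i): the average number of bits/symbol
# the source truly needs. For the uniform 4-symbol source {N, S, E, W},
# H = log2(4) = 2 bits, so the simple log2(n) answer is tight; a skewed
# source needs fewer bits per symbol on average.
def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy([0.25] * 4))            # 2.0 bits/symbol (uniform)
print(entropy([0.7, 0.1, 0.1, 0.1]))  # below 2 bits/symbol (skewed)
```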

This note was uploaded on 04/01/2011 for the course CSE 802, taught by Professor Dr. Adnan Khalid during the Spring '11 term at the College of E&ME, NUST.

