University of Central Florida
School of Electrical Engineering and Computer Science
EEL-6532: Information Theory and Coding. Spring 2010 - dcm

Lecture 2 - Wednesday, January 13, 2010

Shannon Entropy

In his endeavor to construct mathematically tractable models of communication, Shannon concentrated on stationary and ergodic^1 sources of classical information. A stationary source of information emits symbols with a probability that does not change over time, and an ergodic source emits information symbols with a probability equal to the frequency of their occurrence in a long sequence. Stationary ergodic sources of information have a finite but arbitrary and potentially long correlation time.

In the late 1940s Shannon introduced a measure of the quantity of information a source could generate [13]. Earlier, in 1927, another scientist from Bell Labs, Ralph Hartley, had proposed taking the logarithm of the total number of possible messages as a measure of the amount of information in a message generated by a source, arguing that the logarithm tells us how many digits or characters are required to convey the message. Shannon recognized the relationship between thermodynamic entropy and informational entropy and, on von Neumann's advice, he called the negative logarithm of the probability of an event entropy^2.

Consider an event which happens with probability p; we wish to quantify the information content of a message communicating the occurrence of this event, and we impose the condition that the measure should reflect the surprise brought by the occurrence of this event. An initial guess for a measure of this surprise would be 1/p: the lower the probability of the event, the larger the surprise. But this simplistic approach does not resist scrutiny, because the surprise should be additive. If an event is composed of two independent events which occur with probabilities q and r, then the probability of the composite event should be p = qr, but we see that

    \frac{1}{p} \neq \frac{1}{q} + \frac{1}{r}.

On the other hand, if the surprise is measured by the logarithm of 1/p, then the additivity property is obeyed:

    \log \frac{1}{p} = \log \frac{1}{q} + \log \frac{1}{r}.

Given a probability distribution with \sum_i p_i = 1, we see that the uncertainty is in fact equal to the average surprise

    \sum_i p_i \log \frac{1}{p_i}.

The entropy is a measure of the uncertainty of a single random variable X before it is observed, or the average uncertainty removed by observing it. This quantity is called entropy due to its similarity to the thermodynamic entropy.

Footnotes:
1. A stochastic process is said to be ergodic if time averages are equal to ensemble averages, in other words, if its statistical properties, such as its mean and variance, can be deduced from a single, sufficiently long sample (realization) of the process.
2. It is rumored that von Neumann told Shannon: "It is already in use under that name and, besides, it will give you a great edge in debates because nobody really knows what entropy is anyway" [2].
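To make the discussion concrete, the short Python sketch below (not part of the original lecture) computes the surprise \log_2(1/p) of a single event and the entropy \sum_i p_i \log_2(1/p_i) of a finite distribution, and checks the additivity property for two independent events. The function names and the choice of base-2 logarithms (so that entropy is measured in bits) are illustrative assumptions, not notation from the lecture.

```python
import math

def surprise(p):
    """Surprise (self-information) of an event with probability p, in bits."""
    return math.log2(1.0 / p)

def entropy(probabilities):
    """Shannon entropy: the average surprise of a finite distribution, in bits."""
    assert abs(sum(probabilities) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with p = 0 contribute nothing (p log 1/p -> 0 as p -> 0).
    return sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)

# Additivity: for independent events with probabilities q and r, the composite
# event has probability p = qr and its surprise is the sum of the surprises.
q, r = 0.5, 0.25
assert math.isclose(surprise(q * r), surprise(q) + surprise(r))

# A fair coin carries 1 bit of uncertainty; a biased coin carries less.
print(entropy([0.5, 0.5]))   # 1.0
print(entropy([0.9, 0.1]))   # approximately 0.469
```

The two printed values illustrate the point of the definition: the more predictable the source, the smaller the average surprise removed by observing it.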