Chapter 14: Entropy

14-1 Introduction

The probability of an event $A$, denoted $P(A)$, can be interpreted as a measure of the uncertainty of the occurrence of the event:

- If $P(A) = 1$, we are certain that $A$ will occur: no uncertainty at all.
- If $P(A) = 0$, we are certain that $A$ will not occur.

On the other hand, for $P(A) = 0.5$ the probabilities that $A$ will occur and that $A$ will not occur are equal, so the uncertainty about $A$ is maximal.

Entropy is defined as an uncertainty measure of a partition $U$ of the probability space of an experiment $S$. The probability space $S$ is the set of all elementary events (outcomes). For a partition $U = \{A_1, \ldots, A_N\}$ we have $A_1 \cup A_2 \cup \cdots \cup A_N = S$ and $A_i \cap A_j = \varnothing$ for $i \neq j$. The entropy of $U$ is defined as

$$H(U) = -p_1 \log p_1 - \cdots - p_N \log p_N,$$

where $p_i = P(A_i)$.

Information = uncertainty. The viewpoint is that the occurrence of an uncertain event brings more information, while a certain event (one sure to occur or sure not to occur) carries no information.

Ex. 14-1: Experiment of tossing a fair die.
(a) Let $U = \{\text{even}, \text{odd}\}$. Obviously $P\{\text{even}\} = P\{\text{odd}\} = 0.5$. Hence
$$H(U) = -\tfrac{1}{2}\log\tfrac{1}{2} - \tfrac{1}{2}\log\tfrac{1}{2} = \log 2.$$
(b) Let $V$ be the partition of elementary events, $V = \{f_i\}$. Obviously $P\{f_i\} = \tfrac{1}{6}$. Then $H(V) = \log 6$.
The difference of the two entropies is $\log 6 - \log 2 = \log 3$. It is the uncertainty about $V$ assuming that $U$ is known: if we know that the outcome is even (or odd), then the uncertainty about the outcome reduces to $\log 3$. This quantity is known as the conditional entropy.

Ex. 14-2: For a coin experiment, let $P\{h\} = p$. Then
$$H(V) = -p \log p - (1-p)\log(1-p) \triangleq h(p).$$
Fig. 14-2 displays $h(p)$ versus $p$: the maximum occurs at $p = 0.5$, and $h(p) = 0$ at $p = 0$ and $p = 1$.

14-2 Basic Concepts

Let $U = \{A_i\} = \{A_1, \ldots, A_N\}$ be a partition of $S$, where the $A_i$, $i = 1, \ldots, N$, are events.

(1) If there are only two events, we call it a binary case. Usually $U$ is denoted as $U = \{A, \bar{A}\}$, where $\bar{A}$ is the complement of $A$.
(2) If the events of $U$ are all elementary events $\{\xi_i\}$, we denote it by $V$ and call it the element partition.
(3) A refinement of $U$ is a partition $B$ of $S$ such that every $B_j \in B$ is a subset of some event $A_i$. It is denoted by $B \prec U$. In other words, the events $A_i$ are divided into sub-events contained in $B$.
(4) The product of two partitions $U$ and $B$ is the partition that contains all intersections $A_i \cap B_j$ of their elements, and is denoted by $U \cdot B$. $U \cdot B$ is the largest common refinement of $U$ and $B$.

Properties:
- $V \prec U$ holds for any $U$.
- $U \cdot B = B \cdot U$.
- If $U_1 \prec U_2$ and $U_2 \prec U_3$, then $U_1 \prec U_3$.
- If $B \prec U$, then $U \cdot B = B$.

Entropy definition:
$$H(U) = -(p_1 \log p_1 + \cdots + p_N \log p_N) = \sum_{i=1}^{N} \varphi(p_i),$$
where $p_i = P(A_i)$ and $\varphi(p) = -p \log p$.
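The numbers in Example 14-1 are easy to check numerically. Below is a minimal Python sketch, not part of the original notes: the notes leave the logarithm base unspecified, so base 2 is assumed here (entropies in bits), and the helper name `entropy` is my own.

```python
import math

def entropy(probs, base=2):
    """H(U) = -sum_i p_i log p_i, with the convention 0 * log 0 = 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Ex. 14-1(a): fair die, U = {even, odd}
H_U = entropy([0.5, 0.5])          # log 2 = 1 bit
# Ex. 14-1(b): element partition V = {f_1, ..., f_6}
H_V = entropy([1/6] * 6)           # log 6 ~ 2.585 bits
print(H_U, H_V, H_V - H_U)         # the difference is log 3 ~ 1.585 bits:
                                   # the conditional entropy of V given U
```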
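The shape described for Fig. 14-2 can be confirmed the same way. This sketch (again only an illustration, with base-2 logs assumed) evaluates the binary entropy function $h(p)$ from Ex. 14-2 at a few points, showing the zeros at the endpoints and the maximum at $p = 0.5$.

```python
import math

def h(p, base=2):
    """Binary entropy h(p) = -p log p - (1-p) log(1-p), with h(0) = h(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log(p, base) + (1 - p) * math.log(1 - p, base))

for p in (0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0):
    print(f"h({p}) = {h(p):.4f}")   # 0 at the endpoints, maximum of 1 bit at p = 0.5
```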
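The partition operations of Section 14-2 can also be made concrete. In the sketch below (my own encoding, not from the notes) a partition is a list of frozensets of outcomes; `refines` tests $B \prec U$ and `product` forms $U \cdot B$, and the die example checks the properties $V \prec U$ and $U \cdot B = B$ when $B \prec U$.

```python
def refines(B, U):
    """B ≺ U: every block of B is contained in some block of U."""
    return all(any(b <= a for a in U) for b in B)

def product(U, B):
    """U · B: all nonempty intersections A_i ∩ B_j."""
    return [a & b for a in U for b in B if a & b]

S = range(1, 7)                                   # fair-die outcomes
U = [frozenset({2, 4, 6}), frozenset({1, 3, 5})]  # even / odd
V = [frozenset({i}) for i in S]                   # element partition

assert refines(V, U)                    # V ≺ U holds for any U
assert set(product(U, V)) == set(V)     # since V ≺ U, the product U·V equals V
```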