# Chapter 14: Entropy


Teacher: Prof. Sin-horng Chen
Office: 805, Tel: ext. 31822, Email: [email protected]

## 14-1 Introduction

The probability of an event $A$ is denoted $P(A)$. For a single event $A$, $P(A)$ serves as a measure of our uncertainty about whether $A$ occurs:

- If $P(A) = 1$ or $P(A) = 0$, there is no uncertainty at all.
- If $P(A) = 0.5$, the uncertainty about $A$ is maximum.

**Entropy** $H$ extends this idea from a single event to a whole experiment: it is an uncertainty measure defined on a partition $U$ of the probability space $S$, whose elementary events are the outcomes.

A **partition** $U = \{A_1, \ldots, A_N\}$ is a collection of events such that

$$A_1 \cup A_2 \cup \cdots \cup A_N = S, \qquad A_i \cap A_j = \varnothing \quad (i \neq j).$$

The entropy of $U$ is defined as

$$H(U) = -p_1 \log p_1 - \cdots - p_N \log p_N, \qquad \text{where } p_i = P(A_i).$$

Information equals uncertainty: the larger $H$ is, the more uncertain we are before the experiment, and the more information we gain by observing its outcome.

**Ex. 14-1.** Consider the fair-die experiment.

(a) Let $U = \{\text{even}, \text{odd}\}$, with $P\{\text{even}\} = P\{\text{odd}\} = 0.5$. Hence

$$H(U) = -\tfrac{1}{2}\log\tfrac{1}{2} - \tfrac{1}{2}\log\tfrac{1}{2} = \log 2.$$

(b) Let $V$ be the partition whose events are the elementary events $\{f_i\}$, each with $P\{f_i\} = \tfrac{1}{6}$. Then

$$H(V) = \log 6.$$

The uncertainty about $V$ assuming $U$ (the **conditional entropy**) is $\log 3$: once we know whether the outcome is even or odd, three equally likely faces remain, so the residual uncertainty is $\log 3$. Note that this is consistent with $\log 6 = \log 2 + \log 3$.
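The entropies in Ex. 14-1 can be checked numerically. Below is a minimal sketch; the helper name `entropy` and the choice of base-2 logarithms (entropy in bits) are mine, not from the notes:

```python
import math

def entropy(probs, base=2):
    """H = -sum p_i * log(p_i); terms with p == 0 contribute 0 by convention."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Ex. 14-1, fair die:
H_U = entropy([0.5, 0.5])          # partition U = {even, odd}:   log2(2) = 1 bit
H_V = entropy([1/6] * 6)           # element partition V:          log2(6) ≈ 2.585
H_V_given_U = entropy([1/3] * 3)   # uncertainty left after U:     log2(3) ≈ 1.585

# Consistency check: log 6 = log 2 + log 3, i.e. H(V) = H(U) + H(V|U)
print(abs(H_V - (H_U + H_V_given_U)) < 1e-9)
```

The final check illustrates the decomposition mentioned in the example: the total uncertainty of the element partition splits into the even/odd uncertainty plus the residual uncertainty within each half.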
**Ex. 14-2.** For the coin experiment with $P\{h\} = p$,

$$H(V) = -p \log p - (1-p)\log(1-p) \equiv h(p).$$

Fig. 14-2 plots $h(p)$ versus $p$: it attains its maximum at $p = 0.5$ and equals $0$ at $p = 0$ and $p = 1$.

## 14-2 Basic Concepts

Let $U = \{A_i\} = \{A_1, \ldots, A_N\}$ be a partition of $S$; the $A_i$ are events.

(1) If the partition has two events, it is called **binary**. In this case we usually denote $U$ by $U = \{A, \bar{A}\}$, where $\bar{A}$ is the complement of $A$.

(2) If the events of $U$ are the elementary events $\{\xi_i\}$, we denote the partition by $V$ and call it the **element partition**.

(3) A **refinement** of $U$ is a partition $B$ of $S$ such that every $B_j \in B$ is a subset of some $A_i$; this is denoted $B \prec U$. Thus every event of $B$ is contained in an event of $U$ (see Fig. 14-4): $B \prec U$ iff for each $j$ there is an $i$ with $B_j \subset A_i$. A **common refinement** of two partitions is a refinement of both.

(4) The **product** of $U$ and $B$ is the partition whose elements are the intersections $A_i \cap B_j$; it is denoted $U \cdot B$. The product $U \cdot B$ is a common refinement of $U$ and $B$.

**Properties**

- $V \prec U$ for any $U$.
- $U \cdot B = B \cdot U$ and $(U \cdot B) \cdot C = U \cdot (B \cdot C)$.
- If $U_1 \prec U_2$ and $U_2 \prec U_3$, then $U_1 \prec U_3$.
- If $B \prec U$, then $U \cdot B = B$.

**Entropy**

Definition:

$$H(U) = -(p_1 \log p_1 + \cdots + p_N \log p_N) = \sum_{i=1}^{N} \varphi(p_i), \qquad p_i = P(A_i),$$

where $\varphi(p) = -p \log p$. Since $\varphi(p) \geq 0$ for $0 \leq p \leq 1$, it follows that $H(U) \geq 0$, with $H(U) = 0$ iff $p_i = 1$ for some $i$ (and all other probabilities zero).

For a binary partition with $p = P(A)$ and $P(\bar{A}) = 1 - p$,

$$H(U) = -p \log p - (1-p)\log(1-p) = h(p).$$

If $p_1 = p_2 = \cdots = p_N = \tfrac{1}{N}$, then

$$H(U) = -\tfrac{1}{N}\log\tfrac{1}{N} - \cdots - \tfrac{1}{N}\log\tfrac{1}{N} = \log N.$$

In this case, if $N = 2^m$, then $H(U) = m$. (Here the base of the log operation is 2, so entropy is measured in bits.)

**Inequality.** The function $\varphi(p) = -p \log p$ is $\cap$-convex (concave), hence

$$\varphi(p_1) + \varphi(p_2) < \varphi(p_1 + \varepsilon) + \varphi(p_2 - \varepsilon), \qquad \text{where } p_1 < p_1 + \varepsilon \leq p_2 - \varepsilon < p_2;$$

that is, moving two probabilities closer together increases the sum of their $\varphi$ values (see Fig. 14-6).

From this inequality:

(1) If $U = \{A_1, A_2, \ldots, A_N\}$ and $B = \{B_a, B_b, A_2, \ldots, A_N\}$, where $B_a$ and $B_b$ split $A_1$ into two events (see Fig. 14-7), then

$$H(U) \leq H(B),$$

because $P(A_1) = P(B_a) + P(B_b)$ and

$$\varphi(P(A_1)) \leq \varphi(P(B_a)) + \varphi(P(B_b))$$

by the concavity of $\varphi$ together with $\varphi(0) = 0$. In words: refining a partition can only increase its entropy.
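Two of the facts above can be verified numerically: the uniform partition gives $H = \log N$ (exactly $m$ bits when $N = 2^m$), and splitting one event of a partition never decreases the entropy. The partition probabilities below are hypothetical values chosen for illustration, not taken from the notes:

```python
import math

def entropy(probs, base=2):
    """H(U) = -sum p_i * log(p_i); terms with p == 0 contribute 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Uniform partition: H(U) = log N; with N = 2**m and base-2 logs, H = m bits.
assert abs(entropy([1/8] * 8) - 3.0) < 1e-9   # N = 8 = 2**3  ->  3 bits

# Splitting inequality (Fig. 14-7): replacing A1 by two events Ba, Bb with
# P(Ba) + P(Bb) = P(A1) can only increase the entropy, so H(U) <= H(B).
U = [0.5, 0.3, 0.2]           # hypothetical partition {A1, A2, A3}
B = [0.25, 0.25, 0.3, 0.2]    # A1 (p = 0.5) split into Ba and Bb
print(entropy(U) <= entropy(B))
```

The split in `B` was chosen to be an even one, but the inequality holds for any split of $A_1$, since $\varphi(x + y) \leq \varphi(x) + \varphi(y)$ does not depend on how the mass is divided.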

## This note was uploaded on 07/21/2009 for the course CM EM5102 taught by Professor Sin-horng Chen during the Fall '08 term at National Chiao Tung University.
