
# EE 740 Homework 2, 22 January 08, Edmund G. Zelnio, Wright State University


## Problem 2.12, Cover and Thomas: Example of Joint Entropy

Let $(X, Y)$ have the joint distribution shown in Figure 1. (All logarithms are base 2, entropies are in bits, and we use the convention $0 \log 0 = 0$.)

Figure 1: Joint probability mass function $p(x, y)$:

|           | $Y = 0$ | $Y = 1$ |
|-----------|---------|---------|
| $X = 0$   | 1/3     | 1/3     |
| $X = 1$   | 0       | 1/3     |

The marginals are $p(X = 0) = 2/3$, $p(X = 1) = 1/3$ and $p(Y = 0) = 1/3$, $p(Y = 1) = 2/3$.

a. Find $H(X)$, $H(Y)$.

$$H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x) \tag{1}$$

$$H(X) = -\sum_{x \in \{0,1\}} p(x) \log p(x) = -\tfrac{2}{3} \log \tfrac{2}{3} - \tfrac{1}{3} \log \tfrac{1}{3} = 0.9183$$

$$H(Y) = -\tfrac{1}{3} \log \tfrac{1}{3} - \tfrac{2}{3} \log \tfrac{2}{3} = 0.9183$$

b. Find $H(X|Y)$, $H(Y|X)$.

$$H(X|Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(x|y) \tag{2}$$

$$H(X|Y) = -\tfrac{1}{3} \log 1 - \tfrac{1}{3} \log \tfrac{1}{2} - 0 \log 0 - \tfrac{1}{3} \log \tfrac{1}{2} = \tfrac{2}{3}$$

$$H(Y|X) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(y|x) \tag{3}$$

$$H(Y|X) = -\tfrac{1}{3} \log \tfrac{1}{2} - \tfrac{1}{3} \log \tfrac{1}{2} - 0 \log 0 - \tfrac{1}{3} \log 1 = \tfrac{2}{3}$$

c. Find $H(X, Y)$.

$$H(X, Y) = -\sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log p(x, y) \tag{4}$$

$$H(X, Y) = -\tfrac{1}{3} \log \tfrac{1}{3} - \tfrac{1}{3} \log \tfrac{1}{3} - 0 \log 0 - \tfrac{1}{3} \log \tfrac{1}{3} = \log 3 = 1.5850$$

d. Find $H(Y) - H(Y|X)$. From above, $H(Y) - H(Y|X) = 0.9183 - 0.6667 = 0.2516$.

e. Find $I(X; Y)$.

$$I(X; Y) = \sum_{x \in \mathcal{X}} \sum_{y \in \mathcal{Y}} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)} \tag{5}$$

Comparing with part d, $I(X; Y) = H(Y) - H(Y|X) = 0.2516$.
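The quantities above are easy to check numerically. Below is a minimal Python sketch (not part of the original solution) that recomputes each entropy from the joint PMF in Figure 1; it obtains the conditional entropies via the chain rule $H(Y|X) = H(X, Y) - H(X)$ rather than the direct sums in (2) and (3), which must agree.

```python
import math

# Joint PMF from Figure 1: keys are (x, y) pairs over {0, 1} x {0, 1}.
p = {(0, 0): 1/3, (0, 1): 1/3, (1, 0): 0.0, (1, 1): 1/3}

def entropy_bits(probs):
    """Entropy in bits, using the convention 0 * log 0 = 0."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

# Marginals: p(X=0) = 2/3, p(X=1) = 1/3; p(Y=0) = 1/3, p(Y=1) = 2/3.
px = [p[(0, 0)] + p[(0, 1)], p[(1, 0)] + p[(1, 1)]]
py = [p[(0, 0)] + p[(1, 0)], p[(0, 1)] + p[(1, 1)]]

HX = entropy_bits(px)           # H(X)   = 0.9183
HY = entropy_bits(py)           # H(Y)   = 0.9183
HXY = entropy_bits(p.values())  # H(X,Y) = log2(3) = 1.5850
H_Y_given_X = HXY - HX          # H(Y|X) = 2/3, by the chain rule
H_X_given_Y = HXY - HY          # H(X|Y) = 2/3
I = HX + HY - HXY               # I(X;Y) = 0.2516

print(HX, HY, HXY, H_Y_given_X, H_X_given_Y, I)
```

Every value matches the hand calculation, including the Venn-diagram identity $I(X;Y) = H(X) + H(Y) - H(X,Y)$ used on the last line.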

Figure 2: Venn diagram of the entropy quantities $H(X)$, $H(Y)$, $H(X|Y)$, $H(Y|X)$, $I(X;Y)$, and $H(X,Y)$, illustrating $H(X) - H(X|Y) = I(X;Y) = H(Y) - H(Y|X)$.

f. Draw a Venn diagram of the above information quantities. Shown in Figure 2.

---

## Problem 2.13, Cover and Thomas: Inequality

Show that $\ln x \geq 1 - \frac{1}{x}$ for $x > 0$. We show this by plotting the two functions (see Figure 3) over two large and widely separated regions, illustrating that $\ln x$ is the larger of the two. For a proof, let $f(x) = \ln x - (1 - 1/x)$; then $f'(x) = 1/x - 1/x^2 = (x - 1)/x^2$, which is negative on $(0, 1)$ and positive on $(1, \infty)$, so $f$ attains its minimum $f(1) = 0$ at $x = 1$, and hence $f(x) \geq 0$ for all $x > 0$.

Figure 3: Log-scale plots of $\ln x$ and $1 - 1/x$ over (a) $10^{-16} \leq x \leq 10^{-10}$ and (b) $10^{0} \leq x \leq 10^{10}$, showing that $\ln x$ lies above $1 - 1/x$.

---

## Problem 2.14, Cover and Thomas: Entropy of a Sum

Let $X$ and $Y$ be random variables that take on values $x_1, x_2, \ldots, x_r$ and $y_1, y_2, \ldots, y_s$, respectively, and let $Z = X + Y$.

a. Show that $H(Z|X) = H(Y|X)$. Given $X = x$, we have $Z = z$ exactly when $Y = z - x$; hence $p(Z = z \mid X = x) = p(Y = z - x \mid X = x)$.

$$H(Z|X) = \sum_{x \in \mathcal{X}} p(x)\, H(Z \mid X = x) \tag{6}$$

$$= -\sum_{x \in \mathcal{X}} p(x) \sum_{z} p(Z = z \mid X = x) \log p(Z = z \mid X = x) \tag{7}$$

$$= -\sum_{x \in \mathcal{X}} p(x) \sum_{y \in \mathcal{Y}} p(Y = z - x \mid X = x) \log p(Y = z - x \mid X = x)$$

$$= \sum_{x \in \mathcal{X}} p(x)\, H(Y \mid X = x) = H(Y|X)$$

Now if $X$ and $Y$ are independent,

$$I(X; Y) = \sum_{x, y} p(x, y) \log \frac{p(x, y)}{p(x)\, p(y)} = 0 \tag{8}$$

and since $I(X; Y) = H(Y) - H(Y|X)$, this gives $0 = H(Y) - H(Y|X)$, i.e. $H(Y|X) = H(Y)$. Now, since $H(Z) \geq H(Z|X)$, $H(Z|X) = H(Y|X)$, and $H(Y|X) = H(Y)$, we have $H(Z) \geq H(Y)$. By the symmetric argument conditioning on $Y$, we can establish that $H(Z) \geq H(X)$.

b. Give an example of (necessarily dependent) random variables for which $H(X) > H(Z)$ and $H(Y) > H(Z)$.
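The inequality of Problem 2.13 can also be spot-checked numerically. A short Python sketch follows; the sample points are arbitrary, chosen to span the same extreme ranges as the plots in Figure 3 plus a few points near the equality point $x = 1$.

```python
import math

# Sample x over the extreme ranges plotted in Figure 3 (10^-16 .. 10^10),
# plus points around x = 1 where the two curves touch.
xs = [10.0**k for k in range(-16, 11)] + [0.5, 0.9, 1.0, 1.1, 2.0]

for x in xs:
    # ln(x) >= 1 - 1/x, allowing a tiny tolerance for floating-point error.
    assert math.log(x) >= 1 - 1/x - 1e-12, x

# Equality holds exactly at x = 1: ln(1) = 0 = 1 - 1/1.
assert math.log(1.0) == 1 - 1/1.0
```

This is only a sanity check over finitely many points, of course; the derivative argument above is what makes the claim a proof.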
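For part b, one standard construction (supplied here as an illustration; the document's own answer is not shown in this preview) makes $Y$ a deterministic function of $X$ so that the sum is constant: take $X$ uniform on $\{0, 1\}$ and $Y = -X$, so $Z = X + Y \equiv 0$.

```python
import math

def entropy_bits(probs):
    """Entropy in bits, using the convention 0 * log 0 = 0."""
    return -sum(q * math.log2(q) for q in probs if q > 0)

# X uniform on {0, 1}; Y = -X, so Y is uniform on {0, -1} and fully
# dependent on X; Z = X + Y is identically 0.
HX = entropy_bits([0.5, 0.5])  # H(X) = 1 bit
HY = entropy_bits([0.5, 0.5])  # H(Y) = 1 bit
HZ = entropy_bits([1.0])       # H(Z) = 0 bits (Z is deterministic)

assert HX > HZ and HY > HZ
```

Here $H(X) = H(Y) = 1 > 0 = H(Z)$, which is only possible because $X$ and $Y$ are dependent, consistent with part a.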
