
ECE 255AN Fall 2011 Homework set 1 - solutions

2. (a) The optimal code is $1,\ 01,\ 001,\ \ldots,\ 0^{n-2}1,\ 0^{n-1}1,\ 0^{n-1}0$.

   (b) $H(P) = \sum_{i=1}^{\infty} \frac{1}{2^i}\log 2^i = \sum_{i=1}^{\infty} \frac{i}{2^i} = \frac{1/2}{(1-1/2)^2} = 2$, and the optimal code is $0,\ 10,\ 110,\ \ldots$.

3. False. Let $X$ be non-constant and $Y = -X$.

4. $A$ decreases with increasing $\alpha$. For $\alpha = 1$, we approximate $\sum_{n=2}^{N} \frac{1}{n \log n}$ by $\int_2^N \frac{dx}{x \log x} \approx \log\log N$; thus the sum diverges. For $\alpha > 1$, $A$ can be approximated by $\int_2^{\infty} \frac{dx}{x (\log x)^{\alpha}}$, which is finite. Using this, for part (b) the entropy is finite for $\alpha > 2$ and infinite otherwise. Similarly, $B$ converges for $\beta > 1$ and diverges otherwise, and the entropy is finite for $\beta > 2$ and infinite otherwise.

5. False. Consider $H \overset{\mathrm{def}}{=} H(p, 1/3, 2/3 - p)$. For $p = 1/3$, $H = \log 3 > 1$, while for $p = 2/3$, $H = h(2/3) < 1$. Hence by continuity, for some $1/3 < p < 2/3$, $H = 1$, namely $\lceil H \rceil = H$. But since at least one of the probabilities, $1/3$, is not a power of 2, the expected length will be $> H$, and hence $> \lceil H \rceil$.

6. $h(p) + 1.5(1 - p)$.

7. Consider any distributions $(p_1, p_2, \ldots)$ and $(q_1, q_2, \ldots)$. We need to show that
$$\frac{1}{2} \sum_{i \ge 1} \left( p_i \log p_i + q_i \log q_i \right) \;\ge\; \sum_{i \ge 1} \frac{p_i + q_i}{2} \log \frac{p_i + q_i}{2}.$$
It suffices to show, for each $i$, that
$$\mathrm{LHS}_i \overset{\mathrm{def}}{=} p_i \log p_i + q_i \log q_i - (p_i + q_i) \log \frac{p_i + q_i}{2} \;\ge\; 0.$$
Dividing by $p_i + q_i$ and using the concavity of the logarithm,
$$-\frac{\mathrm{LHS}_i}{p_i + q_i} = \frac{p_i}{p_i + q_i} \log \frac{p_i + q_i}{2p_i} + \frac{q_i}{p_i + q_i} \log \frac{p_i + q_i}{2q_i} \;\le\; \log\!\left( \frac{1}{2} + \frac{1}{2} \right) = 0.$$
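As a quick numeric sanity check on the computations in 2(b), 5, and 7, here is a short Python sketch; the `entropy` helper and the example distributions `P` and `Q` are illustrative assumptions, not part of the problem set.

```python
import math

# Problem 2(b): H(P) = sum_i i/2^i should equal 2 bits.
# Truncating the infinite series at i = 200 is far beyond machine precision.
print(sum(i / 2**i for i in range(1, 200)))  # 2.0

def entropy(p):
    """Shannon entropy in bits, with 0*log(0) taken as 0."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Problem 5: H(1/3, 1/3, 1/3) = log 3 > 1, while h(2/3) < 1.
print(entropy([1/3, 1/3, 1/3]))  # ~1.585 = log2(3)
print(entropy([2/3, 1/3]))       # ~0.918 = h(2/3)

# Problem 7: concavity of entropy, H((P+Q)/2) >= (H(P) + H(Q))/2,
# checked on an arbitrary pair of distributions.
P = [0.5, 0.25, 0.125, 0.125]
Q = [0.1, 0.2, 0.3, 0.4]
M = [(p + q) / 2 for p, q in zip(P, Q)]
print(entropy(M) >= (entropy(P) + entropy(Q)) / 2)  # True
```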
Solutions to Chapter 2 problems

21. Markov inequality applied to entropy.
$$
\begin{aligned}
P(p(X) < d) \log \frac{1}{d} &= \sum_{x \,:\, p(x) < d} p(x) \log \frac{1}{d} & (1)\\
&\le \sum_{x \,:\, p(x) < d} p(x) \log \frac{1}{p(x)} & (2)\\
&\le \sum_{x} p(x) \log \frac{1}{p(x)} & (3)\\
&= H(X). & (4)
\end{aligned}
$$
Hence $P(p(X) < d) \le H(X) / \log(1/d)$.

Solutions to Chapter 5 problems

14. Huffman code.

(a) The Huffman tree for this distribution is

Codeword
00      x1   6/21   6/21   6/21   9/21   12/21   1
10      x2   5/21   5/21   6/21   6/21    9/21
11      x3   4/21   4/21   5/21   6/21
010     x4   3/21   3/21   4/21
0110    x5   2/21   3/21
0111    x6   1/21

(b) The ternary Huffman tree is

Codeword
1       x1   6/21   6/21   10/21   1
2       x2   5/21   5/21    6/21
00      x3   4/21
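As a sanity check on the codeword lengths in the binary table of part (a), here is a minimal Huffman sketch in Python; `huffman_lengths` is an illustrative helper (not from the solutions), and its tie-breaking may differ from the tree drawn above while still yielding optimal lengths.

```python
import heapq
from itertools import count

def huffman_lengths(probs):
    """Binary Huffman coding: return the codeword length of each symbol.
    Uses a min-heap of (probability, tiebreak, symbol indices)."""
    tie = count()  # unique tiebreaker so the heap never compares lists
    heap = [(p, next(tie), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    depth = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # two least likely nodes
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:                  # every symbol under the merged
            depth[i] += 1                  # node moves one level deeper
        heapq.heappush(heap, (p1 + p2, next(tie), s1 + s2))
    return depth

probs = [6/21, 5/21, 4/21, 3/21, 2/21, 1/21]
print(huffman_lengths(probs))  # [2, 2, 2, 3, 4, 4], matching the table
```

With the table's lengths, the expected codeword length is $(6 \cdot 2 + 5 \cdot 2 + 4 \cdot 2 + 3 \cdot 3 + 2 \cdot 4 + 1 \cdot 4)/21 = 51/21 \approx 2.43$ bits.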