Solution_to_Information_Theory


Elements of Information Theory, Second Edition
Solutions to Problems

Thomas M. Cover
Joy A. Thomas

September 22, 2006

COPYRIGHT 2006 Thomas Cover, Joy Thomas. All rights reserved.

Contents

1  Introduction                                         7
2  Entropy, Relative Entropy and Mutual Information     9
3  The Asymptotic Equipartition Property               49
4  Entropy Rates of a Stochastic Process               61
5  Data Compression                                    97
6  Gambling and Data Compression                      139

Preface

The problems in the book, Elements of Information Theory, Second Edition, were chosen from the problems used during the course at Stanford. Most of the solutions here were prepared by the graders and instructors of the course. We would particularly like to thank Prof. John Gill, David Evans, Jim Roche, Laura Ekroot and Young Han Kim for their help in preparing these solutions.

Most of the problems in the book are straightforward, and we have included hints in the problem statement for the difficult problems. In some cases, the solutions include extra material of interest (for example, the problem on coin weighing on Pg. 12).

We would appreciate any comments, suggestions and corrections to this Solutions Manual.

Tom Cover                                Joy Thomas
Durand 121, Information Systems Lab      Stratify
Stanford University                      701 N Shoreline Avenue
Stanford, CA 94305                       Mountain View, CA 94043
Ph. 415-723-4505                         Ph. 650-210-2722
FAX: 415-723-8473                        FAX: 650-988-2159
Email: cover@isl.stanford.edu            Email: jat@stratify.com

Chapter 1: Introduction

Chapter 2: Entropy, Relative Entropy and Mutual Information

1. Coin flips. A fair coin is flipped until the first head occurs. Let X denote the number of flips required.

(a) Find the entropy H(X) in bits. The following expressions may be useful:

\[
\sum_{n=0}^{\infty} r^{n} = \frac{1}{1-r}, \qquad \sum_{n=0}^{\infty} n r^{n} = \frac{r}{(1-r)^{2}}.
\]

(b) A random variable X is drawn according to this distribution. Find an "efficient" sequence of yes-no questions of the form, "Is X contained in the set S?" Compare H(X) to the expected number of questions required to determine X.

Solution:

(a) The number X of tosses until the first head appears has the geometric distribution with parameter p = 1/2, where P(X = n) = p q^{n-1}, n ∈ {1, 2, ...}, and q = 1 - p. Hence the entropy of X is

\[
\begin{aligned}
H(X) &= -\sum_{n=1}^{\infty} p q^{n-1} \log\left( p q^{n-1} \right) \\
     &= -\left[ \sum_{n=0}^{\infty} p q^{n} \log p + \sum_{n=0}^{\infty} n p q^{n} \log q \right] \\
     &= \frac{-p \log p}{1-q} - \frac{p q \log q}{p^{2}} \\
     &= \frac{-p \log p - q \log q}{p} \\
     &= H(p)/p \ \text{bits},
\end{aligned}
\]

where the last line uses 1 - q = p and H(p) denotes the binary entropy function. If p = 1/2, then H(X) = 2 bits.

(b) Intuitively, it seems clear that the best questions are those that have equally likely chances of receiving a yes or a no answer. Consequently, one possible guess is that the most efficient series of questions is: Is X = 1? If not, is X = 2? ...
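The closed form H(X) = H(p)/p, and the match between H(X) and the expected number of questions under the "Is X = 1? Is X = 2? ..." strategy, can be checked numerically. The sketch below is not part of the original solutions; it truncates the infinite sums, which is harmless because the terms decay geometrically.

```python
import math

# Geometric distribution P(X = n) = p * q**(n-1), n = 1, 2, ...
p = 0.5
q = 1 - p
N = 200  # truncation point; the tail beyond n = 200 is negligible

# Entropy summed term by term, in bits.
H_numeric = -sum(p * q**(n - 1) * math.log2(p * q**(n - 1))
                 for n in range(1, N + 1))

# Closed form H(X) = H(p)/p, with H(p) the binary entropy function.
H_binary = -p * math.log2(p) - q * math.log2(q)
H_closed = H_binary / p

# Expected number of questions under the strategy "Is X = 1? If not,
# is X = 2? ...": question n is the last one asked exactly when X = n,
# so the expectation is E[X] = sum n * P(X = n) = 1/p.
E_questions = sum(n * p * q**(n - 1) for n in range(1, N + 1))

print(H_numeric, H_closed, E_questions)  # all three are ≈ 2.0 for p = 1/2
```

For p = 1/2 all three quantities equal 2, so this questioning strategy is optimal: its expected length meets the entropy bound exactly.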
This note was uploaded on 04/05/2011 for the course EE 5368 taught by Professor Staff during the Spring '08 term at UT Arlington.
