Elements of Information Theory, Second Edition
Solutions to Problems

Thomas M. Cover
Joy A. Thomas

September 22, 2006

COPYRIGHT 2006 Thomas Cover, Joy Thomas. All rights reserved.

Contents

1  Introduction
2  Entropy, Relative Entropy and Mutual Information
3  The Asymptotic Equipartition Property
4  Entropy Rates of a Stochastic Process
5  Data Compression
6  Gambling and Data Compression

Preface

The problems in the book, Elements of Information Theory, Second Edition, were chosen from the problems used during the course at Stanford. Most of the solutions here were prepared by the graders and instructors of the course. We would particularly like to thank Prof. John Gill, David Evans, Jim Roche, Laura Ekroot and Young Han Kim for their help in preparing these solutions.

Most of the problems in the book are straightforward, and we have included hints in the problem statements for the difficult problems. In some cases, the solutions include extra material of interest (for example, the problem on coin weighing on Pg. 12).

We would appreciate any comments, suggestions and corrections to this Solutions Manual.

Tom Cover                               Joy Thomas
Durand 121, Information Systems Lab     Stratify
Stanford University                     701 N Shoreline Avenue
Stanford, CA 94305                      Mountain View, CA 94043
Ph. 415-723-4505                        Ph. 650-210-2722
FAX: 415-723-8473                       FAX: 650-988-2159
Email: cover@isl.stanford.edu           Email: jat@stratify.com

Chapter 1: Introduction

Chapter 2: Entropy, Relative Entropy and Mutual Information

1. Coin flips. A fair coin is flipped until the first head occurs. Let X denote the number of flips required.

(a) Find the entropy H(X) in bits. The following expressions may be useful:

    \sum_{n=0}^{\infty} r^n = \frac{1}{1-r}, \qquad \sum_{n=0}^{\infty} n r^n = \frac{r}{(1-r)^2}.

(b) A random variable X is drawn according to this distribution. Find an "efficient" sequence of yes-no questions of the form, "Is X contained in the set S?"
Compare H(X) to the expected number of questions required to determine X.

Solution:

(a) The number X of tosses till the first head appears has the geometric distribution with parameter p = 1/2, where P(X = n) = p q^{n-1}, n \in \{1, 2, \ldots\}. Hence the entropy of X is

    H(X) = -\sum_{n=1}^{\infty} p q^{n-1} \log(p q^{n-1})
         = -\left[ \sum_{n=0}^{\infty} p q^n \log p + \sum_{n=0}^{\infty} n p q^n \log q \right]
         = \frac{-p \log p}{1-q} - \frac{p q \log q}{p^2}
         = \frac{-p \log p - q \log q}{p}
         = H(p)/p \text{ bits.}

If p = 1/2, then H(X) = 2 bits.

(b) Intuitively, it seems clear that the best questions are those that have equally likely chances of receiving a yes or a no answer. Consequently, one possible guess is that the most "efficient" series of questions is: Is X = 1? If not, is X = 2? ...
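Both the closed form H(p)/p and the expected question count for the "Is X = 1? If not, is X = 2?" strategy can be checked numerically. The sketch below is ours, not from the manual (all function names are our own); it sums the geometric distribution directly and compares against the derivation above. Note that determining X = n with the sequential strategy takes exactly n questions, so the expected number of questions is E[X] = 1/p.

```python
import math

def geometric_entropy(p, n_max=1000):
    """Entropy in bits of X ~ Geometric(p), P(X = n) = p * (1-p)**(n-1),
    computed by direct summation (stopping once terms underflow to zero)."""
    q = 1 - p
    total = 0.0
    for n in range(1, n_max + 1):
        p_n = p * q ** (n - 1)
        if p_n == 0.0:
            break  # remaining probability mass underflows; its contribution is negligible
        total -= p_n * math.log2(p_n)
    return total

def closed_form(p):
    """The closed form H(p)/p derived in part (a)."""
    q = 1 - p
    return (-p * math.log2(p) - q * math.log2(q)) / p

def expected_questions(p, n_max=1000):
    """Expected number of questions for the sequential strategy in part (b):
    identifying X = n takes n questions, so this is just E[X] = 1/p."""
    q = 1 - p
    return sum(n * p * q ** (n - 1) for n in range(1, n_max + 1))

p = 0.5
print(geometric_entropy(p))   # ~2.0 bits
print(closed_form(p))         # 2.0 bits
print(expected_questions(p))  # ~2.0 questions, matching H(X) when p = 1/2
```

For p = 1/2 the entropy and the expected number of questions coincide at 2, which is the comparison the problem asks for; for other p the direct sum still agrees with H(p)/p.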
This note was uploaded on 04/05/2011 for the course EE 5368 taught by Professor Staff during the Spring '08 term at UT Arlington.