Information Theory – EE376A
Course Reader

Tsachy Weissman

Winter 2010
Springer

Preface

These notes form an outline of the core of the material I plan to cover in the course. Most of the theorems, lemmas, and auxiliary results are stated without their proofs. In the lectures, I will follow these notes, filling in proofs, details, emphasis, and intuition. Time permitting, we will cover some additional topics, such as channels with feedback and the colored Gaussian channel, which are not included here; handouts on these topics will be provided if and when relevant. I wish us lots of fun this quarter as we explore the fundamentals of this exciting field.

Stanford, January 2010
Tsachy Weissman

Contents

1 Introduction
  1.1 What is Information Theory?
  1.2 Examples
      1.2-A Binary Source and Channel
      1.2-B Lossless Compression
      1.2-C Lossy Compression
      1.2-D AWGN Channel
  1.3 Expectations from the Course

2 Information Measures and Some of Their Properties
  2.1 Axiomatic Derivation of an Uncertainty Measure
  2.2 Additional Information Measures and Some Properties

3 Fixed-Length Lossless Source Coding
  3.1 Fixed-Length “Near Lossless” Coding
  3.2 The Asymptotic Equipartition Property (AEP)
  3.3 A Direct and a Converse Theorem for Block Coding

4 Variable-Length Lossless Coding
  4.1 Dyadic Distributions
  4.2 General Distributions
  4.3 Huffman Code
  4.4 Other UD Code Constructions
  4.5 Kraft-McMillan Inequality and a Converse Result

5 Channel Coding and Capacity
  5.1 Problem Formulation
      5.1-A Coding with a Cost Constraint
  5.2 Memoryless Channels
  5.3 Main Result
  5.4 DMC Examples
  5.5 On Mutual Information and Differential Entropy
  5.6 Capacity of the AWGN Channel

6 Proof of the Channel Capacity Theorem
  6.1 Proof of the Converse Part
  6.2 Proof of the Direct Part
      6.2-A A Joint AEP
      6.2-B A Random Code

7 Lossy Source Coding and the Rate Distortion Function
  7.1 Motivation and Problem Formulation
  7.2 Main Result
  7.3 Properties of R(D)
  7.4 Example: Binary Source
  7.5 Example: Gaussian Source

8 The Method of Types
  8.1 Definitions
  8.2 Results
  8.3 Sanov’s Theorem
  8.4 Application to Universal Lossless Coding
  8.5 Joint, Conditional, and Strong Typicality
      8.5-A Joint and Conditional Typicality
      8.5-B Strong Typicality

9 Proof of the Rate Distortion Theorem
  9.1 Proof of the Converse Part: A Counting Argument
  9.2 Proof of the Direct Part: Random Coding

10 Another Proof of the Converse in Channel Coding

11 Joint Source-Channel Coding
  11.1 Problem Formulation
  11.2 The Separation Theorem
  11.3 Examples
       11.3-A Binary Source and BSC
       11.3-B Gaussian Source and Channel

Further Reading
References
Index

1 Introduction

1.1 What is Information Theory?

Information theory is the science of compression, storage, and transmission of information. It is one of the few scientific disciplines fortunate to have a precise birthday: the publication of Claude Shannon’s 1948 paper entitled “A Mathematical Theory of Communication”.
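
For a first taste of the central quantity developed in Chapter 2, consider the Shannon entropy H(X) = -sum_x p(x) log2 p(x), measured in bits. The short Python sketch below is purely illustrative (the helper name "entropy" is chosen for this example, not notation from these notes); it evaluates the entropy of a fair and of a biased binary source:

    import math

    def entropy(pmf):
        # Shannon entropy in bits: H = -sum p * log2(p),
        # with the usual convention that 0 * log 0 = 0.
        return -sum(p * math.log2(p) for p in pmf if p > 0)

    print(entropy([0.5, 0.5]))  # fair coin: 1.0 bit per symbol
    print(entropy([0.9, 0.1]))  # biased coin: ~0.469 bits per symbol

The biased source is more predictable, and Chapters 3 and 4 make precise the sense in which it can therefore be losslessly compressed to about 0.469 bits per symbol on average.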