EE670 Information Theory and Coding, Ch. 3

2 EE670 Information Theory and Coding

Class home page: http://koala.ece.stevens-tech.edu/~utureli/EE670/ (class information, assignments and other links)
Text: Elements of Information Theory, Thomas Cover and Joy A. Thomas, Wiley, 1991.
Entropy, Data Compression, Rate Distortion Theory: Ch. 2-3
HW due Feb 3: 2.5, 2.11, 2.19, 2.23, 3.3
3 Entropy

Entropy (uncertainty, self-information) of a discrete R.V. X:

$$H(X) = \sum_{x} p(x)\log\frac{1}{p(x)} = -\sum_{x} p(x)\log p(x) = E\left[\log\frac{1}{p(X)}\right]$$

When the logarithm is taken base 2, information is measured in bits (the usual choice for discrete R.V.s); when base e is used, it is measured in nats (common for continuous R.V.s).
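As a quick illustration of this definition, here is a minimal Python sketch; the helper name `entropy` and the test distributions are illustrative choices, not from the slides:

```python
import math

def entropy(p, base=2):
    """H(X) = sum_x p(x) log(1/p(x)); terms with p(x) = 0 contribute 0."""
    return -sum(px * math.log(px, base) for px in p if px > 0)

print(entropy([0.5, 0.5]))           # fair coin: 1.0 bit
print(entropy([1/6] * 6))            # fair die: log2(6) ~ 2.585 bits
print(entropy([0.5, 0.5], math.e))   # same coin in nats: ln 2 ~ 0.693
```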
4 Entropy Properties

Shift and scale invariant: H(X) depends on the probabilities of the outcomes, not on the values of the outcomes.
Nonnegative: H(X) >= 0, since log p(x) <= 0 whenever 0 < p(x) <= 1.
Maximum: the entropy of a R.V. is largest when the outcomes are equally likely.
Source coding will provide an operational justification for the definition of entropy as the uncertainty of a R.V.
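The maximum-entropy property is easy to check numerically. Reusing the `entropy` helper sketched above, a binary R.V. attains its largest entropy at p = 1/2:

```python
# Binary entropy H([p, 1-p]) peaks at the uniform case p = 1/2
# and stays >= 0 everywhere, as the properties above state.
for p in (0.01, 0.1, 0.3, 0.5, 0.7, 0.99):
    print(f"p = {p:<4}  H = {entropy([p, 1 - p]):.4f} bits")
```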
5 Joint and Conditional Entropy

Joint entropy:

$$H(X,Y) = \sum_{x,y} p(x,y)\log\frac{1}{p(x,y)} = E\left[\log\frac{1}{p(X,Y)}\right]$$

Examples:
1. Two fair coin tosses: P(X=x) = 1/2, P(Y=y) = 1/2. Independent, so P(X=x, Y=y) = 1/2 × 1/2 = 1/4, and H(X,Y) = 4 × (1/4) log 4 = 4 × (1/4) × 2 = 2 bits.
2. If X, Y are independent, p(x,y) = p(x)p(y), then H(X,Y) = H(X) + H(Y).
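A small sketch confirming the two-coin example; the `joint_entropy` helper and the dict representation of p(x,y) are illustrative choices:

```python
import math
from itertools import product

def joint_entropy(pxy, base=2):
    """H(X,Y) = sum_{x,y} p(x,y) log(1/p(x,y)) over a dict of joint probabilities."""
    return -sum(p * math.log(p, base) for p in pxy.values() if p > 0)

# Two independent fair coin tosses: p(x, y) = 1/4 for each of the 4 outcomes
pxy = {(x, y): 0.25 for x, y in product("HT", repeat=2)}
print(joint_entropy(pxy))  # 2.0 bits, matching the example above
```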
6 Independent R.V.s X, Y

$$\begin{aligned}
H(X,Y) &= \sum_{x,y} p(x,y)\log\frac{1}{p(x,y)} = \sum_{x,y} p(x)p(y)\log\frac{1}{p(x)p(y)} \\
&= \sum_{x,y} p(x)p(y)\log\frac{1}{p(x)} + \sum_{x,y} p(x)p(y)\log\frac{1}{p(y)} \\
&= \sum_{y} p(y)\sum_{x} p(x)\log\frac{1}{p(x)} + \sum_{x} p(x)\sum_{y} p(y)\log\frac{1}{p(y)} \\
&= H(X)\sum_{y} p(y) + H(Y)\sum_{x} p(x) \\
&= H(X) + H(Y)
\end{aligned}$$
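This additivity can also be checked numerically for an arbitrary product distribution, reusing the `entropy` helper from the first sketch; the marginals below are made-up values:

```python
# Check H(X,Y) = H(X) + H(Y) for an independent pair.
px = [0.7, 0.2, 0.1]
py = [0.5, 0.5]
pxy = [a * b for a in px for b in py]   # p(x,y) = p(x) p(y)
print(entropy(pxy))                     # ~2.157 bits
print(entropy(px) + entropy(py))        # same value
```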
7 Joint Entropy

The joint entropy H(X,Y) can also be expressed as (again for independent X, Y):

$$\begin{aligned}
H(X,Y) &= E\left[-\log p(X,Y)\right] = E\left[-\log\big(p(X)\,p(Y)\big)\right] \\
&= E\left[-\log p(X)\right] + E\left[-\log p(Y)\right] \\
&= H(X) + H(Y)
\end{aligned}$$
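The expectation form suggests a Monte Carlo view: sample X from p and average −log₂ p(x). A rough sketch, with an arbitrary illustrative distribution:

```python
import math
import random

# H(X) = E[-log2 p(X)]: draw samples of X and average -log2 p(x).
p = {"a": 0.5, "b": 0.25, "c": 0.25}   # exact entropy: 1.5 bits
xs = random.choices(list(p), weights=list(p.values()), k=100_000)
print(sum(-math.log2(p[x]) for x in xs) / len(xs))  # ~1.5
```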
8 Conditional Probability and Bayes' Rule

The knowledge that a certain event occurred can change the probability that another event will occur. P(x|y) denotes the conditional probability that x will happen given that y has happened.

Bayes' rule states that

$$P(x \mid y) = \frac{P(y \mid x)\,P(x)}{P(y)}$$

The total probability formula states that

$$P(A) = P(A \mid B)\,P(B) + P(A \mid \neg B)\,P(\neg B)$$

or, in the more general case of a partition $B_1, \dots, B_n$,

$$P(A) = \sum_{i=1}^{n} P(A \mid B_i)\,P(B_i)$$

Note: writing P(A) = α P(A|B) + (1 − α) P(A|¬B) with α = P(B) mirrors our intuition that the unconditional probability of an event lies somewhere between its conditional probabilities under two opposing assumptions.
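A tiny numerical sketch of total probability and Bayes' rule; the probabilities below are made-up illustrative values, not from the slides:

```python
# Made-up numbers: a rare condition B and a noisy test outcome A.
p_b = 0.01               # P(B)
p_a_given_b = 0.95       # P(A | B)
p_a_given_not_b = 0.05   # P(A | not B)

# Total probability: P(A) = P(A|B) P(B) + P(A|~B) P(~B)
p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)

# Bayes' rule: P(B|A) = P(A|B) P(B) / P(A)
p_b_given_a = p_a_given_b * p_b / p_a
print(p_a, p_b_given_a)  # 0.059, ~0.161
```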
9 Information Theory: Entropy

We want a fundamental measure that will allow us to …