ch3 - EE670 Information Theory and Coding

2 EE670 Information Theory and Coding

- Class home page: http://koala.ece.stevens-tech.edu/~utureli/EE670/
  Class information, assignments, and other links.
- Text: Elements of Information Theory, Thomas Cover and Joy A. Thomas, Wiley, 1991.
  - Entropy, Data Compression, Rate Distortion Theory
  - Ch. 2-3 HW due Feb 3: 2.5, 2.11, 2.19, 2.23, 3.3
3 Entropy

- Entropy (uncertainty, self-information) of a discrete R.V. X:

  H(X) = -Σ_x p(x) log p(x) = Σ_x p(x) log(1/p(x)) = E[log(1/p(X))]

- When base 2 is used in the logarithm, information is measured in bits (typical for discrete R.V.s).
- When base e is used, information is measured in nats (commonly used for continuous R.V.s).
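Not from the original slides: a minimal Python sketch of this definition. The `entropy` helper name and the use of NumPy are my own choices for illustration.

```python
import numpy as np

def entropy(p, base=2):
    """Entropy of a discrete distribution given as an array of probabilities."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # by convention, 0 * log(1/0) is taken as 0
    return float(np.sum(p * np.log(1.0 / p)) / np.log(base))

# A fair coin carries 1 bit of entropy; a biased coin carries less.
print(entropy([0.5, 0.5]))    # 1.0
print(entropy([0.9, 0.1]))    # ~0.469
```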
4 Entropy Properties

- Shift and scale invariant: entropy depends on the probabilities of the outcomes, not on the values of the outcomes.
- Non-negative: H(X) >= 0, since log p(x) <= 0 whenever p(x) <= 1.
- Other:
  - Entropy is maximized when the outcomes are equally likely.
  - Source coding will provide a justification for the definition of entropy as the uncertainty of a R.V.
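A quick numeric check of the maximum-entropy property, reusing the illustrative `entropy` helper sketched above (my own example, not from the slides):

```python
# Uniform over 4 outcomes attains the maximum log2(4) = 2 bits;
# any non-uniform distribution over the same 4 outcomes is lower.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
print(entropy([0.7, 0.1, 0.1, 0.1]))       # ~1.357
```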
5 Joint and Conditional Entropy

- Joint entropy:

  H(X,Y) = Σ_{x,y} p(x,y) log(1/p(x,y)) = E[log(1/p(X,Y))]

- Examples:
  1. Two fair coin tosses: P(X=x) = 1/2, P(Y=y) = 1/2. They are independent, so P(X=x, Y=y) = 1/2 * 1/2 = 1/4, and
     H(X,Y) = 4 * (1/4) log 4 = 4 * (1/4) * 2 = 2 bits.
  2. If X and Y are independent, p(x,y) = p(x) p(y), then H(X,Y) = H(X) + H(Y).
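The two-coin example can be verified with the same illustrative helper:

```python
# Two independent fair coin tosses: four equally likely (x, y) pairs.
joint = [0.25, 0.25, 0.25, 0.25]
print(entropy(joint))   # 2.0 bits = H(X) + H(Y) = 1 + 1
```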
6 Independent RVs X, Y

H(X,Y) = Σ_{x,y} p(x,y) log(1/p(x,y))
       = Σ_{x,y} p(x) p(y) log(1/(p(x) p(y)))
       = Σ_{x,y} p(x) p(y) log(1/p(x)) + Σ_{x,y} p(x) p(y) log(1/p(y))
       = Σ_y p(y) Σ_x p(x) log(1/p(x)) + Σ_x p(x) Σ_y p(y) log(1/p(y))
       = H(X) Σ_y p(y) + H(Y) Σ_x p(x)
       = H(X) + H(Y)
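A numeric check of this additivity, with made-up marginal distributions (illustration only, again using the `entropy` helper from above):

```python
# Check H(X, Y) = H(X) + H(Y) for independent X, Y.
px = np.array([0.7, 0.3])
py = np.array([0.2, 0.5, 0.3])
pxy = np.outer(px, py).ravel()    # p(x, y) = p(x) p(y)
print(entropy(pxy))               # ~2.366
print(entropy(px) + entropy(py))  # same value
```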
7 Joint Entropy

The joint entropy H(X,Y) can also be expressed as an expectation. For independent X and Y:

H(X,Y) = E[-log p(X,Y)]
       = E[-log(p(X) p(Y))]
       = E[-log p(X)] + E[-log p(Y)]
       = H(X) + H(Y)
8 Conditional Probability and Bayes' Rule

The knowledge that a certain event occurred can change the probability that another event will occur. P(x|y) denotes the conditional probability that x will happen given that y has happened. Bayes' rule states that:

P(x|y) = P(y|x) P(x) / P(y)

The complete (total) probability formula states that

P(A) = P(A|B) P(B) + P(A|¬B) P(¬B)

or, in the more general case, P(A) = Σ_i P(A|B_i) P(B_i) for a partition {B_i}.

Note: P(A) = α P(A|B) + (1 - α) P(A|¬B) mirrors our intuition that the unconditional probability of an event lies somewhere between its conditional probabilities under the two opposing assumptions.
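A small worked example with hypothetical numbers, showing total probability and Bayes' rule together:

```python
# Hypothetical numbers illustrating total probability and Bayes' rule.
p_b = 0.3               # P(B)
p_a_given_b = 0.8       # P(A | B)
p_a_given_not_b = 0.1   # P(A | not B)

p_a = p_a_given_b * p_b + p_a_given_not_b * (1 - p_b)   # P(A) = 0.31
p_b_given_a = p_a_given_b * p_b / p_a                   # P(B | A) ~ 0.774
print(p_a, p_b_given_a)
```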
9 Information Theory: Entropy

We want a fundamental measure that will allow us to ...