ECE 178 Digital Image Processing
Discussion Session #8
{anindya, msargin}@ece.ucsb.edu
March 2, 2007

These notes are based on material in "Cover & Thomas".

Entropy is a measure of the uncertainty of a random variable. Let X be a discrete random variable that takes values from an alphabet $\mathcal{X}$ with probability mass function p(x) = Pr{X = x}, x ∈ $\mathcal{X}$. For notational convenience, we denote p_X(x) (the probability that the random variable X takes the value x) by p(x). The entropy H(X) of a discrete random variable X is defined by

    H(X) = -\sum_{x \in \mathcal{X}} p(x) \log p(x)        (1)

If the base of the logarithm is b, we denote the entropy as H_b(X). If the base of the logarithm is e, the entropy is measured in nats. Generally, logarithms are computed to base 2, and the corresponding unit of entropy is bits.

For a binary random variable X with Pr(X = 0) = p and Pr(X = 1) = 1 - p, the entropy H(X) depends only on p and can be written H(p). Thus,

    H(p) = -(p \log_2 p + (1 - p) \log_2 (1 - p)).

Example: suppose a random variable X has a uniform distribution over 32 possible outcomes. Since an outcome of X can take one of 32 values, we need a 5-bit number to represent it. Assuming X takes the values 1 to 32 with equal probability, Pr(X = i) = 1/32 for every i. The entropy of X is

    H(X) = -\sum_{i=1}^{32} p(i) \log p(i) = -\sum_{i=1}^{32} \frac{1}{32} \log \frac{1}{32} = \log 32 = 5 bits,

assuming logarithms to base 2. Thus, the entropy equals the number of bits required to represent X. In this case, a 5-bit number represents X exactly (with no uncertainty), which is why the entropy of a random variable is called a measure of its "uncertainty".

We now consider an example where X follows a non-uniform distribution. Suppose we have a horse race with 8 horses taking part, and assume that their probabilities of winning are (1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64). We can calculate the entropy of the horse race from (1):

    H(X) = -\left( \tfrac{1}{2}\log\tfrac{1}{2} + \tfrac{1}{4}\log\tfrac{1}{4} + \tfrac{1}{8}\log\tfrac{1}{8} + \tfrac{1}{16}\log\tfrac{1}{16} + 4 \cdot \tfrac{1}{64}\log\tfrac{1}{64} \right) = 2 bits.
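As a quick check of these calculations, here is a minimal Python sketch (not part of the original notes; it assumes NumPy is available, and the helper name entropy is our own choice). It evaluates (1) for a given probability mass function and reproduces both examples above.

import numpy as np

def entropy(p, base=2.0):
    # H(X) = -sum_x p(x) log p(x); terms with p(x) = 0 contribute nothing,
    # following the convention 0 * log 0 = 0.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)) / np.log(base))

# Uniform distribution over 32 outcomes: entropy is log2(32) = 5 bits.
print(entropy(np.ones(32) / 32))                                  # 5.0

# Horse race with 8 horses and the winning probabilities above: 2 bits.
print(entropy([1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64]))     # 2.0

# Binary entropy function H(p) at p = 1/2: 1 bit.
print(entropy([0.5, 0.5]))                                        # 1.0

Passing base=np.e instead of the default would report the same quantities in nats rather than bits.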