The source $X$ is distributed uniformly in $\{1, 2, \ldots, m\}$ and the distortion measure is
\[
d(x,\hat{x}) = \begin{cases} 0 & \text{if } x = \hat{x}, \\ 1 & \text{if } x \neq \hat{x}. \end{cases}
\]
First, we note that the expected distortion $\mathsf{E}[d(X,\hat{X})]$ equals the probability that $\hat{X}$ differs from $X$:
\[
\mathsf{E}\bigl[d(X,\hat{X})\bigr]
= \sum_{x} p(x) \sum_{\hat{x}} p(\hat{x}\,|\,x)\, d(x,\hat{x})
= \sum_{x} p(x)\, \Pr\bigl[\hat{X} \neq x \,\big|\, X = x\bigr]
= \Pr\bigl[\hat{X} \neq X\bigr].
\]
Hence, finding the rate distortion function consists of minimizing the mutual information under the constraint $\Pr[\hat{X} \neq X] \leq D$. To do so, we introduce the helper variable $E$, which indicates whether $\hat{X}$ equals $X$ or not, i.e.,
\[
E = \begin{cases} 0 & \text{if } \hat{X} \neq X, \\ 1 & \text{if } \hat{X} = X. \end{cases}
\]
We further note that for
\[
P_{\hat{X}}(\hat{x}) = \begin{cases} 1, & \hat{x} = 1, \\ 0, & \text{else}, \end{cases}
\]
the expected distortion equals $\frac{m-1}{m}$. For this choice of distribution we have $I(X;\hat{X}) = 0$, and since the rate distortion function $R(D)$ is nonincreasing in $D$, it follows that $R(D)$ is zero for $D \geq \frac{m-1}{m}$. Thus, in the following we only consider the case $D < \frac{m-1}{m}$.

We have (considering $\Pr[E = 0] = \Pr[\hat{X} \neq X] \leq D$)
\begin{align*}
I(X;\hat{X}) &= H(X) - H(X\,|\,\hat{X}) \\
&= H(X) - H(X\,|\,\hat{X}, E) - H(E\,|\,\hat{X}) \\
&\geq \log m - \Pr[E=0]\,\underbrace{H(X\,|\,\hat{X}, E=0)}_{\leq\, \log(m-1)} - \Pr[E=1]\,\underbrace{H(X\,|\,\hat{X}, E=1)}_{=\,0} - H(E) \\
&\geq \log m - \Pr[E=0]\,\log(m-1) - H_b\bigl(\underbrace{\Pr[E=0]}_{=\,\Pr[\hat{X} \neq X]}\bigr),
\end{align*}
where the second equality holds because $E$ is a function of the pair $(X,\hat{X})$, the first inequality uses $H(X) = \log m$ and $H(E\,|\,\hat{X}) \leq H(E)$, and the second inequality uses the indicated bounds together with $H(E) = H_b(\Pr[E=0])$. Since $\Pr[E=0] = \Pr[\hat{X} \neq X] \leq D < \frac{m-1}{m}$ and the function $\delta \mapsto \log m - \delta \log(m-1) - H_b(\delta)$ is nonincreasing on $\bigl[0, \frac{m-1}{m}\bigr]$, this yields
\[
R(D) \geq \log m - D\,\log(m-1) - H_b(D), \qquad 0 \leq D < \tfrac{m-1}{m}.
\]
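As a numerical sanity check on the bound just derived (this sketch is not part of the original solution), the following Python snippet evaluates $\log m - D\log(m-1) - H_b(D)$ and compares it with the mutual information and expected distortion of the symmetric test channel that reproduces the source symbol with probability $1-D$ and otherwise outputs one of the remaining $m-1$ symbols uniformly. The achievability argument showing that such a channel is optimal is not contained in the excerpt above, and the function names and the values $m = 4$, $D = 0.1$ are illustrative choices.

import numpy as np

def hb(p):
    # Binary entropy in nats; by convention Hb(0) = Hb(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log(p) - (1.0 - p) * np.log(1.0 - p)

def rd_bound(m, D):
    # The lower bound log m - D log(m-1) - Hb(D) derived above, in nats.
    return np.log(m) - D * np.log(m - 1) - hb(D)

def mutual_information(p_x, channel):
    # I(X; Xhat) in nats for a source p_x and conditional law channel[x, xhat].
    p_joint = p_x[:, None] * channel
    p_out = p_joint.sum(axis=0)
    mask = p_joint > 0
    ratio = p_joint[mask] / (p_x[:, None] * p_out[None, :])[mask]
    return float((p_joint[mask] * np.log(ratio)).sum())

m, D = 4, 0.1                      # illustrative values with D < (m-1)/m
p_x = np.full(m, 1.0 / m)          # uniform source on {1, ..., m}

# Symmetric test channel: keep the symbol w.p. 1-D, otherwise pick one of
# the other m-1 symbols uniformly (assumed achieving channel, see above).
channel = np.full((m, m), D / (m - 1))
np.fill_diagonal(channel, 1.0 - D)

expected_distortion = float((p_x[:, None] * channel * (1.0 - np.eye(m))).sum())
print(expected_distortion)               # ~0.1, so the channel meets E[d] <= D
print(mutual_information(p_x, channel))  # ~0.9513 nats
print(rd_bound(m, D))                    # ~0.9513 nats, matching the bound

The agreement between the last two numbers reflects the known fact that for a uniform source under Hamming distortion the bound is tight, i.e. $R(D) = \log m - D\log(m-1) - H_b(D)$ for $0 \leq D \leq \frac{m-1}{m}$; the matching achievability proof is not shown here.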