Note: The solution to this problem is from Section 9.7 of Cover & Thomas (First Edition).

As in the hint, we have $Y = \tilde{X} + U$, where $\tilde{X}$ takes the value $i$ with probability $p_i$, and therefore the distribution of $Y$ has the shape of a histogram (that is, $f_Y(y) = p_i$ for $i \le y < i+1$). It is clear that $H(X) = H(\tilde{X})$, since discrete entropy depends only on the probabilities and not on the values of the outcomes.
Now
\[
H(\tilde{X})
= -\sum_{i=1}^{\infty} p_i \log_2 p_i
= -\sum_{i=1}^{\infty} \left(\int_i^{i+1} f_Y(y)\,dy\right) \log_2\!\left(\int_i^{i+1} f_Y(y)\,dy\right)
= -\sum_{i=1}^{\infty} \int_i^{i+1} f_Y(y)\,\log_2 f_Y(y)\,dy
= -\int_1^{\infty} f_Y(y)\,\log_2 f_Y(y)\,dy
= h(Y),
\]
since $f_Y(y) = p_i$ for $i \le y < i+1$. Hence we have the following chain of inequalities:
\[
H(X) = H(\tilde{X}) = h(Y)
\le \frac{1}{2}\log_2\bigl(2\pi e \operatorname{Var}(Y)\bigr)
= \frac{1}{2}\log_2\bigl(2\pi e\,(\operatorname{Var}(\tilde{X}) + \operatorname{Var}(U))\bigr)
= \frac{1}{2}\log_2\!\left(2\pi e\left(\sum_{i=1}^{\infty} p_i i^2 - \Bigl(\sum_{i=1}^{\infty} p_i i\Bigr)^{\!2} + \frac{1}{12}\right)\right).
\]
Since entropy is invariant with respect to permutation of $p_1, p_2, \ldots$, we can also obtain a bound by a permutation of the $p_i$'s. We conjecture that a good bound on the variance will be achieved when the high probabilities are close together, i.e., by the assignment $\ldots, p_5, p_3, p_1, p_2, p_4, \ldots$ for $p_1 \ge p_2 \ge \cdots$.
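As a quick numerical sanity check (not part of the original solution), the Python sketch below evaluates both sides of the bound for an arbitrary example pmf and compares the naive assignment of $p_i$ to the integer $i$ with the clustered assignment conjectured above. The pmf and all helper names are illustrative choices, not from the text.

```python
import numpy as np

def discrete_entropy(p):
    """Shannon entropy H(X) in bits for a pmf p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def variance_bound(p, values):
    """Right-hand side (1/2) * log2(2*pi*e*(Var + 1/12)) in bits, where
    Var is the variance under the integer assignment `values`."""
    p = np.asarray(p, dtype=float)
    values = np.asarray(values, dtype=float)
    var = np.sum(p * values**2) - np.sum(p * values)**2
    return 0.5 * np.log2(2 * np.pi * np.e * (var + 1.0 / 12.0))

# Example pmf (arbitrary choice), with p1 >= p2 >= p3 >= p4.
p = np.array([0.4, 0.3, 0.2, 0.1])

naive = np.array([1, 2, 3, 4])      # p_i assigned to integer i
clustered = np.array([2, 3, 1, 4])  # ..., p3, p1, p2, p4, ... as in the conjecture

print(discrete_entropy(p), "<=", variance_bound(p, naive))
print(discrete_entropy(p), "<=", variance_bound(p, clustered))
```

For this pmf the clustered assignment reduces the variance from 1.0 to 0.81, so it gives the tighter of the two bounds, while the entropy on the left-hand side is unchanged.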
8.8 Channels with uniformly distributed noise.
\[
\begin{aligned}
C &= \max_{p(x)} I(X;Y)\\
  &= \max_{p(x)} \bigl[h(Y) - h(Y \mid X)\bigr]\\
  &= \max_{p(x)} \bigl[h(Y) - h(X+Z \mid X)\bigr]\\
  &= \max_{p(x)} \bigl[h(Y) - h(Z)\bigr]\\
  &= \max_{p(x)} h(Y) - \log_2 2,
\end{aligned}
\]
where in the last line we have used the fact that the differential entropy of a random variable distributed uniformly between $\alpha$ and $\alpha + a$ is $\log_2 a$ bits.
Furthermore, we see that the output of the channel, $Y$, is limited to values in the range $[-3, 3]$. From a result on maximum-entropy distributions (specifically, see Chapter 12, Example 12.2.4), we see that $h(Y)$ will be maximized if we select $p(x)$ such that the distribution of $Y$ is uniform on $[-3, 3]$. Now, for
\[
p(x = 0) = p(x = 2) = p(x = -2) = \tfrac{1}{3},
\]
it is easy to see that the distribution of $Y$ is uniform on $[-3, 3]$, and from our previous discussion, $h(Y) = \log_2 6$. Therefore, we have
\[
C = \log_2 6 - \log_2 2 = \log_2 3.
\]
Note that one could have arrived at this result using calculus (without explicit knowledge that the uniform distribution maximizes entropy for a variable with bounded range), but it would have involved lengthy (and tedious) calculations.
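As a numerical sanity check (again, not part of the original solution), the sketch below assumes the channel model implied above: additive noise $Z$ uniform on $[-1, 1]$ and the input distribution $p(x=0) = p(x=\pm 2) = 1/3$ found in the solution. It evaluates the mixture density of $Y = X + Z$ on a grid, confirms $h(Y) \approx \log_2 6$, and recovers the rate $\log_2 3$.

```python
import numpy as np

# Assumed model (from the problem, not restated here): Y = X + Z with
# Z ~ Uniform[-1, 1] and X taking the values -2, 0, 2 with probability 1/3 each.

def uniform_pdf(y, lo, hi):
    """Density of Uniform[lo, hi) evaluated pointwise at y."""
    return np.where((y >= lo) & (y < hi), 1.0 / (hi - lo), 0.0)

# Mixture density of Y under p(x=-2) = p(x=0) = p(x=2) = 1/3.
y = np.linspace(-3.0, 3.0, 60_000, endpoint=False)
f_y = (uniform_pdf(y, -3, -1) + uniform_pdf(y, -1, 1) + uniform_pdf(y, 1, 3)) / 3.0

dy = y[1] - y[0]
h_y = -np.sum(f_y * np.log2(f_y)) * dy  # differential entropy of Y in bits
h_z = np.log2(2.0)                      # h(Z) for Z ~ Uniform[-1, 1]

print(h_y, np.log2(6))        # both about 2.585 bits: Y is (numerically) Uniform[-3, 3]
print(h_y - h_z, np.log2(3))  # achieved rate about 1.585 bits, matching C = log2(3)
```

Because the three shifted noise densities tile $[-3, 3]$ without overlap, the mixture is exactly the uniform density $1/6$ on that interval, which is why the numerical value matches $\log_2 6$ so closely.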