# Chapter 8: Differential Entropy

# 1. Differential entropy.

Evaluate the differential entropy $h(X) = -\int f \ln f$ for the following:

(a) The exponential density, $f(x) = \lambda e^{-\lambda x}$, $x \ge 0$.

(b) The Laplace density, $f(x) = \frac{1}{2}\lambda e^{-\lambda |x|}$.

(c) The sum of $X_1$ and $X_2$, where $X_1$ and $X_2$ are independent normal random variables with means $\mu_i$ and variances $\sigma_i^2$, $i = 1, 2$.

Solution: Differential entropy.

(a) Exponential distribution.

$$h(f) = -\int_0^\infty \lambda e^{-\lambda x}\left[\ln \lambda - \lambda x\right] dx \tag{8.1}$$
$$= -\ln \lambda + 1 \text{ nats} \tag{8.2}$$
$$= \log \frac{e}{\lambda} \text{ bits.} \tag{8.3}$$
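As a quick numerical sanity check of (a) (a sketch, not part of the original solution; the rate $\lambda = 2$ and the helper names are my own choices), the closed form $1 - \ln \lambda$ nats can be compared against a direct midpoint-rule evaluation of $-\int f \ln f$:

```python
import math

def exp_pdf(x, lam):
    """Density of the exponential distribution with rate lam, for x >= 0."""
    return lam * math.exp(-lam * x)

def diff_entropy_nats(pdf, lo, hi, n=200_000):
    """Approximate -integral of f ln f over [lo, hi] with a midpoint Riemann sum."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * dx
        f = pdf(x)
        if f > 0:
            total -= f * math.log(f) * dx
    return total

lam = 2.0
# Truncate the integral at 40/lam; the neglected tail is of order e^{-40}.
numeric = diff_entropy_nats(lambda x: exp_pdf(x, lam), 0.0, 40.0 / lam)
closed_form = 1 - math.log(lam)  # 1 - ln(lambda) nats, as in (8.2)
print(numeric, closed_form)     # the two agree to several decimal places
```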
(b) Laplace density.

$$h(f) = -\int_{-\infty}^{\infty} \frac{1}{2}\lambda e^{-\lambda|x|}\left[\ln\frac{1}{2} + \ln\lambda - \lambda|x|\right] dx \tag{8.4}$$
$$= -\ln\frac{1}{2} - \ln\lambda + 1 \tag{8.5}$$
$$= \ln\frac{2e}{\lambda} \text{ nats} \tag{8.6}$$
$$= \log\frac{2e}{\lambda} \text{ bits.} \tag{8.7}$$

(c) Sum of two normal distributions. The sum of two independent normal random variables is also normal, so applying the result derived in class for the normal distribution, since $X_1 + X_2 \sim \mathcal{N}(\mu_1 + \mu_2,\ \sigma_1^2 + \sigma_2^2)$,

$$h(f) = \frac{1}{2}\log 2\pi e\left(\sigma_1^2 + \sigma_2^2\right) \text{ bits.} \tag{8.8}$$

# 4. Quantized random variables.

Roughly how many bits are required on the average
to describe to 3-digit accuracy the decay time (in years) of a radium atom if the half-life of radium is 80 years? Note that the half-life is the median of the distribution.

Solution: Quantized random variables. The differential entropy of an exponentially distributed random variable with mean $1/\lambda$ is $\log\frac{e}{\lambda}$ bits. If the median is 80 years, then

$$\int_0^{80} \lambda e^{-\lambda x}\, dx = \frac{1}{2} \tag{8.11}$$

or

$$\lambda = \frac{\ln 2}{80} = 0.00866, \tag{8.12}$$

and the differential entropy is $\log(e/\lambda)$. To represent the random variable to 3-digit ($\approx 10$-bit) accuracy would need $\log(e/\lambda) + 10 \approx 18.3$ bits.

# 5. Scaling.

Let $h(X) = -\int f(x) \log f(x)\, dx$. Show $h(AX) = \log|\det(A)| + h(X)$.
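The arithmetic in Problem 4 above can be reproduced in a few lines (a sketch; the variable names are mine, and 3 digits are taken as $\log_2 1000 \approx 10$ bits, as in the solution):

```python
import math

half_life = 80.0                 # median of the exponential decay time, in years
lam = math.log(2) / half_life    # solve the median equation (8.11) for lambda
h_bits = math.log2(math.e / lam) # differential entropy log(e/lambda) in bits
total_bits = h_bits + 10         # plus ~10 bits for 3-digit accuracy

print(round(lam, 5))             # ~ 0.00866, matching (8.12)
print(round(total_bits, 1))      # ~ 18.3 bits
```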
Solution: Scaling. Let $Y = AX$. Then the density of $Y$ is

$$g(y) = \frac{1}{|A|} f\left(A^{-1}y\right), \tag{8.13}$$

where $|A|$ denotes $|\det(A)|$. Hence

$$h(AX) = -\int g(y)\log g(y)\, dy \tag{8.14}$$
$$= -\int \frac{1}{|A|} f\left(A^{-1}y\right)\left[\log f\left(A^{-1}y\right) - \log|A|\right] dy \tag{8.15}$$
$$= -\int f(x)\left[\log f(x) - \log|A|\right] dx \tag{8.16}$$
$$= h(X) + \log|A|, \tag{8.17}$$

where (8.16) follows from the substitution $x = A^{-1}y$, $dy = |A|\,dx$.

# 8. Channel with uniformly distributed noise.

Consider an additive channel whose
input alphabet $\mathcal{X} = \{0, \pm 1, \pm 2\}$, and whose output $Y = X + Z$, where $Z$ is uniformly distributed over the interval $[-1, 1]$. Thus the input of the channel is a discrete random variable, while the output is continuous. Calculate the capacity $C = \max_{p(x)} I(X;Y)$ of this channel.

Solution: Uniformly distributed noise. We can expand the mutual information

$$I(X;Y) = h(Y) - h(Y|X) = h(Y) - h(Z) \tag{8.48}$$

and $h(Z) = \log 2$, since $Z \sim U(-1,1)$. The output $Y$ is the sum of a discrete and a continuous random variable, and if the probabilities of $X$ are $p_{-2}, p_{-1}, \ldots, p_2$, then the output distribution of $Y$ is uniform with weight $p_{-2}/2$ for $-3 \le Y \le -2$, uniform with weight $(p_{-2} + p_{-1})/2$ for $-2 \le Y \le -1$, etc. Given that $Y$ ranges from $-3$ to $3$, the maximum entropy it can have is that of a uniform distribution over this range. This can be achieved if the distribution of $X$ is $(1/3, 0, 1/3, 0, 1/3)$. Then $h(Y) = \log 6$ and the capacity of this channel is

$$C = \log 6 - \log 2 = \log 3 \text{ bits.}$$

# 9. Gaussian mutual information.

Suppose that $(X, Y, Z)$ are jointly Gaussian and
that $X \to Y \to Z$ forms a Markov chain. Let $X$ and $Y$ have correlation coefficient $\rho_1$ and let $Y$ and $Z$ have correlation coefficient $\rho_2$. Find $I(X;Z)$.

Solution: Gaussian mutual information. First note that we may without any loss of generality assume that the means of $X$, $Y$ and $Z$ are zero. If in fact the means are not zero, one can subtract the vector of means without affecting the mutual information or the conditional independence of $X, Z$ given $Y$. Let

$$K = \begin{pmatrix} \sigma_x^2 & \sigma_x\sigma_z\rho_{xz} \\ \sigma_x\sigma_z\rho_{xz} & \sigma_z^2 \end{pmatrix}$$

be the covariance matrix of $X$ and $Z$. We can now use Eq. (8.34) to compute

$$I(X;Z) = h(X) + h(Z) - h(X,Z)$$
$$= \frac{1}{2}\log\left(2\pi e\sigma_x^2\right) + \frac{1}{2}\log\left(2\pi e\sigma_z^2\right) - \frac{1}{2}\log\left((2\pi e)^2 |K|\right)$$
$$= -\frac{1}{2}\log\left(1 - \rho_{xz}^2\right).$$

Now, since $X$ and $Z$ are conditionally independent given $Y$,

$$\rho_{xz} = \frac{E\{XZ\}}{\sigma_x\sigma_z} = \frac{E\{E\{XZ\,|\,Y\}\}}{\sigma_x\sigma_z} = \frac{E\{E\{X|Y\}\,E\{Z|Y\}\}}{\sigma_x\sigma_z} = \rho_1\rho_2,$$

using $E\{X|Y\} = \rho_1\frac{\sigma_x}{\sigma_y}Y$ and $E\{Z|Y\} = \rho_2\frac{\sigma_z}{\sigma_y}Y$. We can thus conclude that

$$I(X;Z) = -\frac{1}{2}\log\left(1 - \rho_1^2\rho_2^2\right).$$
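The key step — that the end-to-end correlation is $\rho_1\rho_2$ — can be confirmed by simulation (a sketch with arbitrarily chosen parameters $\rho_1 = 0.8$, $\rho_2 = 0.6$; the construction below builds the Markov chain by adding independent Gaussian noise at each step):

```python
import numpy as np

rng = np.random.default_rng(0)
rho1, rho2 = 0.8, 0.6
n = 1_000_000

# Build X -> Y -> Z: each step adds independent Gaussian noise,
# so (X, Y, Z) are jointly Gaussian and form a Markov chain,
# with corr(X, Y) = rho1 and corr(Y, Z) = rho2.
x = rng.standard_normal(n)
y = rho1 * x + np.sqrt(1 - rho1**2) * rng.standard_normal(n)
z = rho2 * y + np.sqrt(1 - rho2**2) * rng.standard_normal(n)

rho_xz = np.corrcoef(x, z)[0, 1]
mi_nats = -0.5 * np.log(1 - (rho1 * rho2) ** 2)

print(rho_xz)   # empirically close to rho1 * rho2 = 0.48
print(mi_nats)  # I(X;Z) = -0.5 ln(1 - 0.48^2), about 0.131 nats
```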
This note was uploaded on 01/19/2012 for the course ELEC 6151, taught by Professor M.R. Soleymani during the Summer '10 term at Concordia.