Unformatted text preview: ECE 534: Elements of Information Theory, Fall 2010
Homework: 9
Solutions

Exercise 9.14 (book)
Additive noise channel. Consider the channel Y = X + Z, where X is the transmitted signal with power constraint P, Z is independent additive noise, and Y is the received signal. Let

    Z = 0,   with probability 1/10
    Z = Z*,  with probability 9/10

where Z* ∼ N(0, N). Thus Z has a mixture distribution which is the mixture of a Gaussian distribution and a degenerate distribution with mass 1 at 0.
(a) What is the capacity of this channel? This should be a pleasant surprise.
(b) How would you signal in order to achieve capacity?

Solution: Additive noise channel
The capacity of this channel is infinite, since at the times the noise is 0 the output is exactly equal to the input, and we can send an infinite number of bits.
To send information through this channel, just repeat the same real number at the input. When we have three or four outputs that agree, that should correspond to the points where the noise is 0, and we can decode an infinite number of bits.
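This repetition scheme is easy to sanity-check with a short simulation (a sketch; the noise variance, number of repetitions, and variable names are my own choices, not from the problem). Since the Gaussian component almost surely never produces the same output twice, any value seen at least twice must be the noiseless input:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1.0          # variance of the Gaussian component Z* (arbitrary)
n_repeats = 50   # how many times each real number is repeated
n_trials = 200

successes = 0
for _ in range(n_trials):
    x = rng.uniform(-10, 10)                      # the real number to send
    z_star = rng.normal(0.0, np.sqrt(N), n_repeats)
    mask = rng.random(n_repeats) < 0.1            # noise is exactly 0 w.p. 1/10
    z = np.where(mask, 0.0, z_star)
    y = x + z
    # Decode: a Gaussian output value almost surely never repeats exactly,
    # so any value seen at least twice must be the noiseless output x.
    vals, counts = np.unique(y, return_counts=True)
    repeated = vals[counts >= 2]
    if repeated.size == 1 and repeated[0] == x:
        successes += 1

print(successes / n_trials)   # close to P(at least two zero-noise slots) ≈ 0.97
```

With 50 repetitions, the probability that the noise is 0 at least twice is 1 − 0.9⁵⁰ − 50(0.1)(0.9⁴⁹) ≈ 0.97, so the decoder succeeds on almost every trial.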
Exercise 9.15 (book)
Discrete input, continuous output channel. Let Pr{X = 1} = p, Pr{X = 0} = 1 − p, and let Y = X + Z, where Z is uniform over the interval [0, a], a > 1, and Z is independent of X.
(a) Calculate I(X; Y) = H(X) − H(X|Y).
(b) Now calculate I(X; Y) the other way by I(X; Y) = h(Y) − h(Y|X).
(c) Calculate the capacity of this channel by maximizing over p.

Solution: Discrete input, continuous output channel
(a) Since

    f(y | X = 0) = 1/a for 0 ≤ y ≤ a, and 0 otherwise,        (9.92)
    f(y | X = 1) = 1/a for 1 ≤ y ≤ 1 + a, and 0 otherwise,    (9.93)

the output determines the input except on the overlap region: for 0 ≤ y < 1 we must have X = 0, and for a < y ≤ 1 + a we must have X = 1. For y ∈ [1, a] the posterior is Pr{X = 1 | Y = y} = p, so H(X|Y) = Pr{1 ≤ Y ≤ a} H(p) = ((a − 1)/a) H(p), and I(X; Y) = H(X) − H(X|Y) = H(p) − ((a − 1)/a) H(p) = H(p)/a.
(b) Since both conditional densities are uniform of width a, h(Y|X) = log a. The marginal density of Y is (1 − p)/a on [0, 1), 1/a on [1, a], and p/a on (a, 1 + a], which gives h(Y) = H(p)/a + log a, and again I(X; Y) = h(Y) − h(Y|X) = H(p)/a.
(c) C = max_p H(p)/a = 1/a bits, achieved at p = 1/2.
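The identity I(X; Y) = H(p)/a for this uniform-noise channel can be checked numerically by integrating h(Y) over a fine grid (a sketch with example values of p and a chosen by me; natural logarithms throughout):

```python
import numpy as np

p, a = 0.3, 2.5          # example parameters: any p in (0, 1) and a > 1

# Midpoints of a fine grid covering the support [0, 1 + a] of Y = X + Z
dy = 1e-5
y = np.arange(dy / 2, 1.0 + a, dy)

# Marginal density of Y: (1-p)/a on [0,1), 1/a on [1,a], p/a on (a, 1+a]
f = np.where(y < 1.0, (1 - p) / a, np.where(y <= a, 1.0 / a, p / a))

h_Y = -np.sum(f * np.log(f)) * dy    # differential entropy h(Y), in nats
h_Y_given_X = np.log(a)              # both conditionals are uniform of width a
I = h_Y - h_Y_given_X

H_p = -p * np.log(p) - (1 - p) * np.log(1 - p)   # binary entropy H(p), nats
print(I, H_p / a)                    # the two agree: I(X;Y) = H(p)/a
```

Because the density is piecewise constant and the breakpoints fall on grid-cell boundaries, the midpoint sum is essentially exact.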
Exercise 9.16 (Matteo Carminati)
Gaussian mutual information. Suppose that (X, Y, Z, W) are jointly Gaussian and that X → Y → Z → W forms a Markov chain. Let X and Y have correlation coefficient ρ1 and let Y and Z have correlation coefficient ρ2. Find I(X; Z).

Solution
In general, I(X; Z) = h(X) + h(Z) − h(X, Z). Since X, Y, Z are jointly Gaussian, X and Z are jointly Gaussian too; thus their covariance matrix
will be:
    K_XZ = [ σx²         σx σz ρXZ ]
           [ σx σz ρXZ   σz²       ]

So, computing the mutual information:
    I(X; Z) = (1/2) log(2πe σx²) + (1/2) log(2πe σz²) − (1/2) log((2πe)² |K_XZ|)

Substituting the value of the determinant of the covariance matrix we have I(X; Z) = −(1/2) log(1 − ρXZ²).
ρXZ is still unknown, but exploiting Markovity (p(x, z | y) = p(x | y) p(z | y)) we have:

    ρXZ = E[XZ] / (σx σz) = E[ E[XZ | Y] ] / (σx σz) = E[ E[X | Y] E[Z | Y] ] / (σx σz)

Exploiting the fact that X, Y and Z are jointly Gaussian (so E[X | Y] and E[Z | Y] are linear in Y), we can finally write ρXZ = ρXY ρYZ = ρ1 ρ2, and thus I(X; Z) = −(1/2) log(1 − (ρ1 ρ2)²).
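Both steps (the determinant formula and ρXZ = ρ1 ρ2) can be verified numerically by building the joint covariance directly from the Markov structure (a sketch with example correlations chosen by me; unit variances assumed for simplicity):

```python
import numpy as np

rho1, rho2 = 0.8, -0.5   # example correlation coefficients

# Build (X, Y, Z) from independent standard normals via the Markov chain:
#   Y = rho1*X + sqrt(1 - rho1^2)*W1,  Z = rho2*Y + sqrt(1 - rho2^2)*W2
A = np.array([
    [1.0, 0.0, 0.0],
    [rho1, np.sqrt(1 - rho1**2), 0.0],
    [rho2 * rho1, rho2 * np.sqrt(1 - rho1**2), np.sqrt(1 - rho2**2)],
])
K = A @ A.T                        # covariance of (X, Y, Z); unit variances

K_XZ = K[np.ix_([0, 2], [0, 2])]   # marginal covariance of (X, Z)
rho_XZ = K_XZ[0, 1]

# I(X;Z) = h(X) + h(Z) - h(X,Z); the 2*pi*e factors cancel, leaving
I = 0.5 * np.log(K[0, 0]) + 0.5 * np.log(K[2, 2]) - 0.5 * np.log(np.linalg.det(K_XZ))

print(rho_XZ, rho1 * rho2)                      # rho_XZ equals rho1 * rho2
print(I, -0.5 * np.log(1 - (rho1 * rho2)**2))   # matches the closed form
```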
Exercise 9.18 (Johnson Jonaris GadElkarim)
Gaussian channel with time-varying mean. Find the capacity of the Gaussian channel Yi = Xi + Zi, where Zi ∼ N(μi, N). Let Z1, Z2, . . . be independent, and let there be a power constraint P on x^n(W). Find the capacity when:
(a) μi = 0, for all i.
(b) μi = e^i, i = 1, 2, . . .. Assume that μi is known to the transmitter and receiver.
(c) μi unknown, but μi i.i.d. ∼ N(0, N1), for all i.
Solution
1. When μi = 0, the capacity is C = 0.5 log(1 + P/N).
2. When μi = e^i, since it is known at both the transmitter and the receiver, the receiver can subtract it while decoding, and we are back to the zero-mean case of (a); hence C = 0.5 log(1 + P/N).
3. When μi ∼ N(0, N1), the density of Z is fZ(z) = N(0, N) ∗ N(0, N1) = N(0, N + N1), where ∗ denotes convolution; hence C = 0.5 log(1 + P/(N + N1)).
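Part 3 rests on the variance of a sum of independent Gaussians being the sum of the variances; a quick Monte Carlo check (the values of P, N, N1 are example choices of mine):

```python
import numpy as np

rng = np.random.default_rng(1)
P, N, N1 = 4.0, 1.0, 2.0        # example power and noise variances

# Unknown i.i.d. mean mu_i ~ N(0, N1) plus noise ~ N(0, N): the effective
# noise seen by the receiver is their sum, with variance N + N1.
n = 1_000_000
mu = rng.normal(0.0, np.sqrt(N1), n)
z = mu + rng.normal(0.0, np.sqrt(N), n)
print(z.var())                  # close to N + N1 = 3.0

C = 0.5 * np.log2(1 + P / (N + N1))
print(C)                        # capacity in bits per channel use
```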
Problem 9.22
In the achievability proof we argued that the capacity is the maximum achievable rate, where a rate is achievable if X^n can be recovered under the power constraint with a probability of error Pe^(n) that tends to zero as n goes to infinity. Since Z^n = Y^n − X^n, recovering Z^n is equivalent to recovering X^n once Y^n is received; hence the maximum rate is 0.5 log(1 + P/N).
Exercise 9.22 (book)
Recovering the noise. Consider a standard Gaussian channel Y^n = X^n + Z^n, where Zi is i.i.d. ∼ N(0, N), i = 1, 2, . . . , n, and (1/n) Σ_{i=1}^n Xi² ≤ P. Here we are interested in recovering the noise Z^n and we don't care about the signal X^n. By sending X^n = (0, 0, . . . , 0), the receiver gets Y^n = Z^n and can fully determine the value of Z^n. We wonder how much variability there can be in X^n and still recover the Gaussian noise Z^n. The use of the channel looks like

    X^n → [ + Z^n ] → Y^n → Ẑ^n(Y^n)

Argue that, for some R > 0, the transmitter can arbitrarily send one of 2^{nR} different sequences of x^n without affecting the recovery of the noise, in the sense that

    Pr{Ẑ^n ≠ Z^n} → 0 as n → ∞.

For what R is this possible?

Solution: Recovering the noise
We prove that sup R = C = C(P/N) = 0.5 log(1 + P/N).
If R < C, then from the achievability proof of the channel coding theorem, 2^{nR} different X^n sequences can be decoded correctly with arbitrarily small error for n large enough. Once X^n is determined, Z^n can easily be computed as Y^n − X^n.
We show that this is optimal by contradiction. Assume that there is some R > C such that Z^n can be recovered with Pr{Ẑ^n ≠ Z^n} → 0 as n → ∞. But this implies that X^n = Y^n − Z^n can be determined with arbitrary precision; that is, there is a codebook X^n(W), W = 1, . . . , 2^{nR}, with R > C and Pr{X̂^n ≠ X^n} = Pr{Ŵ ≠ W} → 0 as n → ∞. As we saw in the converse proof of the channel coding theorem, this is impossible. Hence we have the contradiction, and R cannot be greater than C.
This note was uploaded on 01/19/2012 for the course ECE 534 taught by Professor Natashadevroye during the Fall '10 term at Ill. Chicago.