Chapter 15: Network Information Theory

1. The cooperative capacity of a multiple access channel (Figure 15.1).

[Figure 15.1: Multiple access channel p(y | x1, x2) with cooperating senders: both senders see (W1, W2), sender i transmits Xi, and the receiver forms (Ŵ1, Ŵ2) from Y.]

(a) Suppose senders X1 and X2 have access to both indices W1 ∈ {1, ..., 2^{nR1}}, W2 ∈ {1, ..., 2^{nR2}}, so that the codewords X1(W1, W2), X2(W1, W2) depend on both indices. Find the capacity region.

(b) Evaluate this region for the binary erasure multiple access channel Y = X1 + X2, Xi ∈ {0, 1}. Compare it to the non-cooperative region.

Solution: Cooperative capacity of a multiple access channel.

(a) When both senders have access to the pair of messages to be transmitted, they can act in concert. The channel is then equivalent to a single-user channel with input alphabet 𝒳1 × 𝒳2 and a larger message set 𝒲1 × 𝒲2. The capacity of this single-user channel is C = max_{p(x)} I(X; Y) = max_{p(x1, x2)} I(X1, X2; Y). The two senders can send at any combination of rates with total rate

    R1 + R2 ≤ C.                                   (15.1)

(b) The capacity of the binary erasure multiple access channel was evaluated in class. When the two senders cooperate to send a common message, the capacity is

    C = max_{p(x1, x2)} I(X1, X2; Y) = max H(Y) = log 3,    (15.2)

achieved, for example, by a uniform distribution on the input pairs (0, 0), (0, 1), and (1, 1). The cooperative and non-cooperative regions are illustrated in Figure 15.2.

[Figure 15.2: Cooperative and non-cooperative capacity regions for the binary erasure multiple access channel, with non-cooperative corners at rates 1/2 and 1, and cooperative sum rate C_cooper = log 3.]
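The value C = log 3 in (15.2) can be checked numerically. A minimal sketch (base-2 logarithms and the helper `entropy` are our choices): under the stated uniform distribution on (0, 0), (0, 1), (1, 1), the adder output Y = X1 + X2 is uniform on {0, 1, 2}.

```python
from collections import Counter
from math import log2

def entropy(probs):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Uniform distribution on the cooperative input pairs (0,0), (0,1), (1,1).
pairs = {(0, 0): 1/3, (0, 1): 1/3, (1, 1): 1/3}

# Induced output distribution of Y = X1 + X2 (noiseless adder).
p_y = Counter()
for (x1, x2), p in pairs.items():
    p_y[x1 + x2] += p

H_Y = entropy(p_y.values())
print(H_Y, log2(3))  # H(Y) attains its maximum log2(3) ≈ 1.585 bits
```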
2. Capacity of multiple access channels. Find the capacity region for each of the following multiple access channels:

(a) Additive modulo-2 multiple access channel: X1 ∈ {0, 1}, X2 ∈ {0, 1}, Y = X1 ⊕ X2.

(b) Multiplicative multiple access channel: X1 ∈ {-1, 1}, X2 ∈ {-1, 1}, Y = X1 · X2.

Solution: Examples of multiple access channels.

(a) Additive modulo-2 MAC, Y = X1 ⊕ X2. Clearly we cannot send at a total rate of more than 1 bit, since H(Y) ≤ 1. We can achieve a rate of 1 bit from sender 1 by setting X2 = 0, and similarly we can send 1 bit/transmission from sender 2. By simple time sharing we can achieve the entire capacity region, which is shown in Figure 15.3.

[Figure 15.3: Capacity region of the additive modulo-2 MAC: the triangle R1 + R2 ≤ 1.]

(b) Multiplier channel, X1, X2 ∈ {-1, 1}, Y = X1 · X2. This channel is equivalent to the previous channel under the mapping -1 → 1 and 1 → 0. Hence the capacity region is the same as that of the previous channel.
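Both facts used in (a), the sum-rate bound I(X1, X2; Y) = 1 bit and the achievability of the corner points, can be verified by brute force (the helper `mi` is ours):

```python
from itertools import product
from math import log2

def mi(joint):
    """Mutual information I(A; B) in bits from a dict {(a, b): p}."""
    pa, pb = {}, {}
    for (a, b), p in joint.items():
        pa[a] = pa.get(a, 0) + p
        pb[b] = pb.get(b, 0) + p
    return sum(p * log2(p / (pa[a] * pb[b]))
               for (a, b), p in joint.items() if p > 0)

# Uniform independent inputs into the mod-2 adder Y = X1 xor X2.
joint_sum = {((x1, x2), x1 ^ x2): 1/4 for x1, x2 in product((0, 1), repeat=2)}
sum_rate = mi(joint_sum)   # I(X1, X2; Y) = 1 bit, matching H(Y) <= 1

# Pinning X2 = 0 turns the channel into an identity map for sender 1.
joint_1 = {(x1, x1 ^ 0): 1/2 for x1 in (0, 1)}
corner = mi(joint_1)       # I(X1; Y) = 1 bit, so the corner (1, 0) is achievable
print(sum_rate, corner)
```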
3. Cutset interpretation of the capacity region of the multiple access channel. For the multiple access channel we know that (R1, R2) is achievable if

    R1      < I(X1; Y | X2),                       (15.3)
    R2      < I(X2; Y | X1),                       (15.4)
    R1 + R2 < I(X1, X2; Y),                        (15.5)

for X1, X2 independent. Show, for X1, X2 independent, that I(X1; Y | X2) = I(X1; Y, X2). Interpret the information bounds as bounds on the rate of flow across the cutsets S1, S2 and S3.

Solution: Cutset interpretation of the capacity region. Since X1 and X2 are independent, I(X1; X2) = 0, and by the chain rule

    I(X1; Y, X2) = I(X1; X2) + I(X1; Y | X2) = I(X1; Y | X2).

We can therefore interpret I(X1; Y, X2) as the maximum amount of information that could flow across the cutset S1 separating sender 1 from the rest of the network; this is an upper bound on the rate R1. Similarly, we can interpret the other bounds as flows across S2 and S3.
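The identity I(X1; Y | X2) = I(X1; Y, X2) for independent inputs can be checked numerically on the adder MAC from Problem 1 (helper names are ours):

```python
from itertools import product
from math import log2

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Independent Bernoulli(1/2) inputs into the adder MAC Y = X1 + X2.
joint = {(x1, x2, x1 + x2): 1/4 for x1, x2 in product((0, 1), repeat=2)}

def marg(keep):
    """Marginal distribution over the coordinate indices in `keep`."""
    m = {}
    for key, p in joint.items():
        sub = tuple(key[i] for i in keep)
        m[sub] = m.get(sub, 0) + p
    return list(m.values())

h_full = H(joint.values())
# I(X1; Y | X2) = H(X1, X2) + H(X2, Y) - H(X2) - H(X1, X2, Y)
i_cond = H(marg((0, 1))) + H(marg((1, 2))) - H(marg((1,))) - h_full
# I(X1; Y, X2) = H(X1) + H(X2, Y) - H(X1, X2, Y)
i_joint = H(marg((0,))) + H(marg((1, 2))) - h_full
print(i_cond, i_joint)  # equal, since I(X1; X2) = 0
```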
4. Gaussian multiple access channel capacity. For the AWGN multiple access channel, prove, using typical sequences, the achievability of any rate pair (R1, R2) satisfying

    R1      < (1/2) log(1 + P1/N),                 (15.6)
    R2      < (1/2) log(1 + P2/N),                 (15.7)
    R1 + R2 < (1/2) log(1 + (P1 + P2)/N).          (15.8)

The proof extends the proof for the discrete multiple access channel in the same way that the proof for the single-user Gaussian channel extends the proof for the discrete single-user channel.

Solution: Gaussian multiple access channel capacity. The essence of the proof of the achievability of the capacity region for the Gaussian multiple access channel is the same as for the discrete multiple access channel. The main difference is the introduction of the power constraint, and the modifications that have to be made to ensure that the codewords satisfy the power constraint with high probability. We will briefly outline the proof of achievability along the lines of the proof in the discrete case, pausing only to emphasize the differences.

The channel is defined by

    Y = X1 + X2 + Z,   Z ~ N(0, N),                (15.9)

with power constraints P1 and P2 on the inputs. The achievable rates for this channel are

    R1      < C(P1/N),                             (15.10)
    R2      < C(P2/N),                             (15.11)
    R1 + R2 < C((P1 + P2)/N),                      (15.12)

where

    C(x) = (1/2) log(1 + x).                       (15.13)

Codebook generation: Generate 2^{nR1} independent codewords X1(w1), w1 ∈ {1, 2, ..., 2^{nR1}}, of length n, with each element drawn i.i.d. ~ N(0, P1 - ε). Similarly generate 2^{nR2} independent codewords X2(w2), w2 ∈ {1, 2, ..., 2^{nR2}}, with each element i.i.d. ~ N(0, P2 - ε). These codewords form the codebook.

Encoding: To send index w1, sender 1 sends the codeword X1(w1). Similarly, to send w2, sender 2 sends X2(w2).

Decoding: The receiver Y^n chooses the pair (i, j) such that

    (X1(i), X2(j), Y) ∈ A_ε^(n),                   (15.14)
    (1/n) Σ_{k=1}^n x1k(i)^2 ≤ P1,                 (15.15)
    (1/n) Σ_{k=1}^n x2k(j)^2 ≤ P2,                 (15.16)

if such a pair (i, j) exists and is unique; otherwise, an error is declared.

By the symmetry of the random code construction, the conditional probability of error does not depend on which pair of indices is sent. So, without loss of generality, we can assume that (w1, w2) = (1, 1). An error occurs in the decoding if

  - (X1(1), X2(1), Y) ∉ A_ε^(n),
  - (X1(i), X2(j), Y) ∈ A_ε^(n) for some i ≠ 1 or j ≠ 1, or
  - x1(1) or x2(1) does not satisfy the power constraint.

Define the events

    E01 = { (1/n) Σ_{k=1}^n X1k(1)^2 > P1 }        (15.17)

and

    E02 = { (1/n) Σ_{k=1}^n X2k(1)^2 > P2 }.       (15.18)

For i, j not both equal to 1,

    Eij = { (X1(i), X2(j), Y) ∈ A_ε^(n) }.         (15.19)

Then by the union of events bound,

    Pe^(n) = P( E01 ∪ E02 ∪ E11^c ∪ ∪_{(i,j) ≠ (1,1)} Eij )                     (15.20)
          ≤ P(E01) + P(E02) + P(E11^c)
            + Σ_{i ≠ 1, j = 1} P(Ei1) + Σ_{i = 1, j ≠ 1} P(E1j) + Σ_{i ≠ 1, j ≠ 1} P(Eij),

where P is the probability given that (1, 1) was sent. Since we choose the codeword elements according to a normal distribution with variance Pi - ε, with very high probability the empirical codeword power will be less than Pi. Hence P(E01) → 0 and P(E02) → 0. From the AEP, P(E11^c) → 0. By the AEP, for i ≠ 1, we have

    P(Ei1) = P( (X1(i), X2(1), Y) ∈ A_ε^(n) )                                   (15.21)
           = ∫_{(x1, x2, y) ∈ A_ε^(n)} f(x1) f(x2, y) dx1 dx2 dy                (15.22)
           ≤ 2^{-n( h(X1) + h(X2, Y) - h(X1, X2, Y) - 3ε )}                     (15.23)
           = 2^{-n( I(X1; X2, Y) - 3ε )}                                        (15.24)
           = 2^{-n( I(X1; Y | X2) - 3ε )}                                       (15.25)
           = 2^{-n( C(P1/N) - 3ε )},                                            (15.26)

since X1 and X2 are independent, and therefore I(X1; X2, Y) = I(X1; X2) + I(X1; Y | X2) = I(X1; Y | X2). Similarly, for j ≠ 1,

    P(E1j) ≤ 2^{-n( C(P2/N) - 3ε )},               (15.27)

and for i ≠ 1, j ≠ 1,

    P(Eij) ≤ 2^{-n( C((P1 + P2)/N) - 4ε )}.        (15.28)

It follows that

    Pe^(n) ≤ P(E01) + P(E02) + P(E11^c) + 2^{nR1} 2^{-n( C(P1/N) - 3ε )}
             + 2^{nR2} 2^{-n( C(P2/N) - 3ε )} + 2^{n(R1 + R2)} 2^{-n( C((P1 + P2)/N) - 4ε )}.    (15.29)

Since ε > 0 is arbitrary, the conditions of the theorem cause each term to tend to 0 as n → ∞. The above bound shows that the average probability of error, averaged over all choices of codebooks in the random code construction, is arbitrarily small. Hence there exists at least one code C* with arbitrarily small probability of error. The achievability of the capacity region is proved.
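The pentagon (15.10)-(15.12) is easy to evaluate numerically; the sample values of P1, P2, N below are our own. With P1 = 10, P2 = 5, N = 1, the individual bounds sum to more than the sum-rate bound C(15) = 2 bits, so the dominant face of the region is non-trivial.

```python
from math import log2

def C(x):
    """Gaussian capacity function C(x) = (1/2) log2(1 + x), in bits."""
    return 0.5 * log2(1 + x)

P1, P2, N = 10.0, 5.0, 1.0   # example powers and noise variance (our choice)

R1_max = C(P1 / N)
R2_max = C(P2 / N)
Rsum_max = C((P1 + P2) / N)

# The sum-rate constraint is strictly tighter than R1_max + R2_max,
# so the achievable region is a pentagon rather than a rectangle.
print(R1_max, R2_max, Rsum_max)
```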
15. Two-way channel. Consider the two-way channel shown in Figure 15.6. The outputs Y1 and Y2 depend only on the current inputs X1 and X2.

(a) By using independently generated codes for the two senders, show that the following rate region is achievable:

    R1 < I(X1; Y2 | X2),                           (15.135)
    R2 < I(X2; Y1 | X1),                           (15.136)

for some product distribution p(x1) p(x2) p(y1, y2 | x1, x2).

(b) Show that the rates for any code for a two-way channel with arbitrarily small probability of error must satisfy

    R1 ≤ I(X1; Y2 | X2),                           (15.137)
    R2 ≤ I(X2; Y1 | X1),                           (15.138)

for some joint distribution p(x1, x2) p(y1, y2 | x1, x2).

The inner and outer bounds on the capacity of the two-way channel are due to Shannon. He also showed that the inner bound and the outer bound do not coincide in the case of the binary multiplying channel 𝒳1 = 𝒳2 = 𝒴1 = 𝒴2 = {0, 1}, Y1 = Y2 = X1 X2. The capacity of the two-way channel is still an open problem.

Solution: Two-way channel.

(a) We will only outline the proof of achievability; it is quite straightforward compared to the more complex channels considered in the text. Fix p(x1) p(x2) p(y1, y2 | x1, x2).

Code generation: Generate a code of size 2^{nR1} of codewords X1(w1), where the x1i are generated i.i.d. ~ p(x1). Similarly generate a codebook X2(w2) of size 2^{nR2}.

Encoding: To send index w1, sender 1 sends X1(w1). Similarly, sender 2 sends X2(w2).

Decoding: Receiver 1 looks for the unique w2 such that (X1(w1), X2(w2), Y1) ∈ A_ε^(n)(X1, X2, Y1). If there is no such w2, or more than one, it declares an error. Similarly, receiver 2 looks for the unique w1 such that (X1(w1), X2(w2), Y2) ∈ A_ε^(n)(X1, X2, Y2).

Analysis of the probability of error: We will only analyze the error at receiver 1; the analysis for receiver 2 is similar. Without loss of generality, by the symmetry of the random code construction, we can assume that (1, 1) was sent. We have an error at receiver 1 if

  - (X1(1), X2(1), Y1) ∉ A_ε^(n)(X1, X2, Y1). The probability of this goes to 0 by the law of large numbers as n → ∞.
  - There exists a j ≠ 1 such that (X1(1), X2(j), Y1) ∈ A_ε^(n)(X1, X2, Y1).

Define the events

    Ej = { (X1(1), X2(j), Y1) ∈ A_ε^(n) }.         (15.139)

Then by the union of events bound,

    Pe^(n) = P( E1^c ∪ ∪_{j ≠ 1} Ej )              (15.140)
           ≤ P(E1^c) + Σ_{j ≠ 1} P(Ej),            (15.141)

where P is the probability given that (1, 1) was sent. From the AEP, P(E1^c) → 0. By Theorem 14.2.3, for j ≠ 1, we have

    P(Ej) = P( (X1(1), X2(j), Y1) ∈ A_ε^(n) )                                   (15.142)
          = Σ_{(x1, x2, y1) ∈ A_ε^(n)} p(x2) p(x1, y1)                          (15.143)
          ≤ |A_ε^(n)| 2^{-n(H(X2) - ε)} 2^{-n(H(X1, Y1) - ε)}                   (15.144)
          ≤ 2^{-n( H(X2) + H(X1, Y1) - H(X1, X2, Y1) - 3ε )}                    (15.145)
          = 2^{-n( I(X2; X1, Y1) - 3ε )}                                        (15.146)
          = 2^{-n( I(X2; Y1 | X1) - 3ε )},                                      (15.147)

since X1 and X2 are independent, and therefore I(X2; X1, Y1) = I(X2; X1) + I(X2; Y1 | X1) = I(X2; Y1 | X1). Therefore

    Pe^(n) ≤ P(E1^c) + 2^{nR2} 2^{-n( I(X2; Y1 | X1) - 3ε )}.                   (15.148)

Since ε > 0 is arbitrary, the conditions of the theorem imply that the probability of error tends to 0 as n → ∞. Similarly, we can show that the probability of error at receiver 2 goes to 0, and thus we have proved the achievability of the region for the two-way channel.

(b) The converse is a simple application of the general Theorem 14.10.1 to this simple case. The sets S can be taken in turn to be each node. We will not go into the details.
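As a small illustration of the inner bound (15.135)-(15.136), one can evaluate it for the binary multiplying channel with independent Bernoulli inputs; the closed forms in the comments follow from the fact that Y = X1 X2 reveals X1 only when X2 = 1. The parameter choice p = q = 1/2 is ours.

```python
from math import log2

def Hb(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# Shannon's inner bound for the binary multiplying channel Y1 = Y2 = X1*X2
# with independent X1 ~ Bern(p), X2 ~ Bern(q):
#   I(X1; Y2 | X2) = q * H(p)   (the output is informative only when X2 = 1)
#   I(X2; Y1 | X1) = p * H(q)
def inner_bound(p, q):
    return q * Hb(p), p * Hb(q)

r1, r2 = inner_bound(0.5, 0.5)
print(r1, r2)  # 0.5 bits each for uniform inputs
```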
17. Slepian-Wolf. Let (X, Y) have the joint pmf p(x, y) on {0, 1, 2} × {0, 1, 2} with p(x, y) = α on the diagonal (x = y) and p(x, y) = β off the diagonal (x ≠ y), where β = 1/6 - α/2. (Note: This is a joint, not a conditional, probability mass function.)

(a) Find the Slepian-Wolf rate region for this source.
(b) What is Pr{X = Y} in terms of α?
(c) What is the rate region if α = 1/3?
(d) What is the rate region if α = 1/9?

Solution: Slepian-Wolf.

(a) H(X, Y) = -Σ p(x, y) log p(x, y) = -3α log α - 6β log β. Since X and Y are uniformly distributed,

    H(X) = H(Y) = log 3                            (15.151)

and

    H(X | Y) = H(Y | X) = H(3α, 3β, 3β).           (15.152)

Hence the Slepian-Wolf rate region is

    R1      ≥ H(X | Y) = H(3α, 3β, 3β),            (15.153)
    R2      ≥ H(Y | X) = H(3α, 3β, 3β),            (15.154)
    R1 + R2 ≥ H(X, Y) = H(3α, 3β, 3β) + log 3.     (15.155)

(b) From the joint distribution, Pr(X = Y) = 3α.

(c) If α = 1/3, then β = 0 and H(X | Y) = H(Y | X) = 0. The rate region then becomes

    R1 ≥ 0,                                        (15.156)
    R2 ≥ 0,                                        (15.157)
    R1 + R2 ≥ log 3.                               (15.158)

(d) If α = 1/9, then β = 1/9 and H(X | Y) = H(Y | X) = log 3. X and Y are independent, and the rate region then becomes

    R1 ≥ log 3,                                    (15.159)
    R2 ≥ log 3,                                    (15.160)
    R1 + R2 ≥ 2 log 3.                             (15.161)
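The corner quantities of the region can be confirmed numerically from the diagonal-α pmf (helper names are ours); the α = 1/9 case reproduces part (d) and α = 1/3 reproduces part (c).

```python
from math import log2

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

def sw_region(alpha):
    """Corner quantities (H(X|Y), H(X,Y)) in bits for the diagonal-alpha source."""
    beta = 1/6 - alpha/2
    joint = [alpha] * 3 + [beta] * 6   # 3 diagonal cells, 6 off-diagonal cells
    h_xy = H(joint)
    return h_xy - log2(3), h_xy        # H(X|Y) = H(X,Y) - H(Y), and H(Y) = log 3

h_cond, h_joint = sw_region(1/9)       # part (d): independent uniform case
print(h_cond, h_joint)
```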
21. Broadcast channel. Consider the following degraded broadcast channel: X passes through a binary erasure channel with erasure probability α1 to produce Y1, and Y1 passes through a second erasure channel with erasure probability α2 to produce Y2.

(a) What is the capacity of the channel from X to Y1?
(b) From X to Y2?
(c) What is the capacity region of all (R1, R2) achievable for this broadcast channel? Simplify and sketch.

Solution: Broadcast channel.

(a) The channel from X to Y1 is a standard erasure channel with probability of erasure α1, and hence the capacity is 1 - α1.

(b) We can show that the effective channel from X to Y2 is a binary erasure channel with erasure probability α1 + α2 - α1α2, and hence the capacity is 1 - α1 - α2 + α1α2 = (1 - α1)(1 - α2).

(c) As in Problem 15.13, the auxiliary random variable U in the capacity region of the broadcast channel can be taken to be binary, with X the output of a BSC(β) whose input is U. We can now evaluate the capacity region for this choice of auxiliary random variable. By symmetry, the best distribution for U is the uniform. Let α = α1 + α2 - α1α2, and therefore 1 - α = ᾱ = ᾱ1 ᾱ2. Hence

    R2 = I(U; Y2)                                  (15.176)
       = H(Y2) - H(Y2 | U)                         (15.177)
       = H(ᾱ/2, α, ᾱ/2) - H(βᾱ, α, β̄ᾱ)            (15.178)
       = H(α) + ᾱ H(1/2) - H(α) - ᾱ H(β)           (15.179)
       = ᾱ (1 - H(β)).                             (15.180)

Also

    R1 = I(X; Y1 | U)                              (15.181)
       = H(Y1 | U) - H(Y1 | U, X)                  (15.182)
       = H(βᾱ1, α1, β̄ᾱ1) - H(α1)                   (15.183)
       = ᾱ1 H(β) + H(α1) - H(α1)                   (15.184)
       = ᾱ1 H(β).                                  (15.185)

These two equations characterize the boundary of the capacity region as β varies. When β = 0, then R1 = 0 and R2 = ᾱ. When β = 1/2, we have R1 = ᾱ1 and R2 = 0. The capacity region is sketched in Figure 15.13.
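The boundary parametrization (15.180) and (15.185) can be swept numerically; the erasure probabilities below are our own sample values, and the endpoints reproduce the single-user capacities 1 - α1 and (1 - α1)(1 - α2).

```python
from math import log2

def Hb(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

def boundary(beta, a1, a2):
    """Boundary point (R1, R2) of the erasure broadcast region, in bits/use."""
    a = a1 + a2 - a1 * a2              # effective erasure probability to Y2
    return (1 - a1) * Hb(beta), (1 - a) * (1 - Hb(beta))

a1, a2 = 0.1, 0.2                      # example erasure probabilities (ours)
r1_end, r2_zero = boundary(0.5, a1, a2)   # beta = 1/2: single-user point for Y1
r1_zero, r2_end = boundary(0.0, a1, a2)   # beta = 0:   single-user point for Y2
print(r1_end, r2_end)  # 0.9 and 0.72 = (1 - a1)(1 - a2)
```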
22. Stereo. The sum and the difference of the right and left ear signals are to be individually compressed for a common receiver. Let Z1 be Bernoulli(p1) and Z2 be Bernoulli(p2), and suppose Z1 and Z2 are independent. Let X = Z1 + Z2 and Y = Z1 - Z2.

(a) What is the Slepian-Wolf rate region of achievable (RX, RY)? (There is a simple way to do this part.)

Solution: Stereo. The joint distribution of X and Y follows directly from that of (Z1, Z2), and hence we can calculate

    H(X) = H(p1 p2, p1 + p2 - 2 p1 p2, (1 - p1)(1 - p2)),                       (15.186)
    H(Y) = H(p1 p2 + (1 - p1)(1 - p2), p1 - p1 p2, p2 - p1 p2),                 (15.187)

and, since (X, Y) determines (Z1, Z2) and vice versa,

    H(X, Y) = H(Z1, Z2) = H(p1) + H(p2),                                        (15.188)

and therefore

    H(X | Y) = H(p1) + H(p2) - H(p1 p2 + (1 - p1)(1 - p2), p1 - p1 p2, p2 - p1 p2),    (15.189)
    H(Y | X) = H(p1) + H(p2) - H(p1 p2, p1 + p2 - 2 p1 p2, (1 - p1)(1 - p2)).          (15.190)

The Slepian-Wolf region in this case is

    R1 ≥ H(X | Y) = H(p1) + H(p2) - H(p1 p2 + (1 - p1)(1 - p2), p1 - p1 p2, p2 - p1 p2),    (15.191)
    R2 ≥ H(Y | X) = H(p1) + H(p2) - H(p1 p2, p1 + p2 - 2 p1 p2, (1 - p1)(1 - p2)),          (15.192)
    R1 + R2 ≥ H(X, Y) = H(p1) + H(p2).                                                       (15.193)
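The key identity H(X, Y) = H(p1) + H(p2) in (15.188) rests on the map (Z1, Z2) → (X, Y) being invertible; a quick numerical check (the parameter values are ours):

```python
from collections import Counter
from math import log2

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

p1, p2 = 0.3, 0.6   # example Bernoulli parameters (our choice)

# Joint distribution of X = Z1 + Z2 and Y = Z1 - Z2.
joint = Counter()
for z1, q1 in ((1, p1), (0, 1 - p1)):
    for z2, q2 in ((1, p2), (0, 1 - p2)):
        joint[(z1 + z2, z1 - z2)] += q1 * q2

h_xy = H(joint.values())
h_z = H([p1, 1 - p1]) + H([p2, 1 - p2])   # H(Z1) + H(Z2)
print(h_xy, h_z)  # equal: (X, Y) determines (Z1, Z2) and vice versa
```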
31. Parallel Gaussian channels from a mobile telephone. Assume that a sender X is sending to two fixed base stations. Assume that the sender sends a signal X that is constrained to have average power P. Assume that the two base stations receive signals Y1 and Y2, where

    Y1 = a1 X + Z1,
    Y2 = a2 X + Z2,

where Z1 ~ N(0, N1), Z2 ~ N(0, N2), and Z1 and Z2 are independent. We will assume the ai's are constant over a transmitted block.

(a) Assuming that both signals Y1 and Y2 are available at a common decoder Y = (Y1, Y2), what is the capacity of the channel from the sender to the common receiver?

(b) If instead the two receivers Y1 and Y2 each independently decode their signals, this becomes a broadcast channel. Let R1 be the rate to base station 1 and R2 be the rate to base station 2. Find the capacity region of this channel.

Solution: Parallel Gaussian channels from a mobile telephone.

(a) Let Y = [Y1, Y2]^T. Obviously,

    I(X; Y) = h(Y1, Y2) - h(Z1, Z2),

thus it is clear that the maximizing distribution on X is Gaussian, N(0, P). Therefore we have h(Y1, Y2) = (1/2) log( (2πe)^2 |K_Y| ), and consequently, by the independence of the noises,

    C = (1/2) log( |K_Y| / (N1 N2) ).

Plugging in |K_Y| = a1^2 P N2 + a2^2 P N1 + N1 N2, we have

    C = (1/2) log( 1 + a1^2 P / N1 + a2^2 P / N2 ).

(b) The problem is equivalent to the degraded broadcast channel with

    Y1' = X + Z1 / a1,
    Y2' = X + Z2 / a2.

Thus the noises are N(0, N1/a1^2) and N(0, N2/a2^2). Without loss of generality assume that N2/a2^2 > N1/a1^2. Then, referring to Example 14.6.6 in Cover and Thomas, the rate region is ...
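A numeric cross-check of part (a) (sample values ours): the determinant |K_Y| computed from the covariance matrix of (Y1, Y2) should match the closed form, and the two resulting expressions for C should agree.

```python
from math import log2

P, a1, a2, N1, N2 = 4.0, 0.8, 0.5, 1.0, 2.0   # example values (our choice)

# Covariance matrix of (Y1, Y2) with X ~ N(0, P):
v1 = a1**2 * P + N1          # Var(Y1)
v2 = a2**2 * P + N2          # Var(Y2)
cov = a1 * a2 * P            # Cov(Y1, Y2)
detK = v1 * v2 - cov**2

# Closed form from the solution: |K_Y| = a1^2 P N2 + a2^2 P N1 + N1 N2.
detK_closed = a1**2 * P * N2 + a2**2 * P * N1 + N1 * N2

C_det = 0.5 * log2(detK / (N1 * N2))
C_closed = 0.5 * log2(1 + a1**2 * P / N1 + a2**2 * P / N2)
print(C_det, C_closed)  # the two expressions agree
```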
Summer '10, M.R. Soleymani