MULTIPLE USER INFORMATION THEORY FOR THE GAUSSIAN CHANNEL

Thomas M. Cover
Stanford University

ABSTRACT. We consider several multiple user information channels. Included are Shannon channels, broadcast channels, multiple access channels, and relay channels. The key ideas needed to achieve capacity are presented for the Gaussian version of these channels.

1. INTRODUCTION

The general problem in multiple user information theory consists of many users attempting to communicate with one another in the presence of arbitrary interference and noise. See van der Meulen [1] for a review of this field. Here we attempt to isolate some of the basic proof techniques. These techniques involve the following ideas: 1) Random coding, 2) Superposition coding, 3) List codes, 4) Slepian-Wolf binning, 5) Block Markov decoding, 6) Joint typicality, 7) Convexification through time sharing.

We restrict attention to Gaussian channels, where joint typicality is equivalent to a Euclidean distance condition. This discussion is expanded in a paper coauthored with Abbas El Gamal, which may appear in the Proceedings of the IEEE [2].

2. GAUSSIAN MULTIPLE USER CHANNELS

The basic discrete time additive white Gaussian noise channel with
input power P and noise variance N is modeled by

    Y_i = x_i + Z_i ,    i = 1, 2, ... ,

where the Z_i are independent identically distributed Gaussian random variables with mean zero and variance N. The signal x = (x_1, x_2, ..., x_n) has a power constraint

    (1/n) Σ_{i=1}^{n} x_i² ≤ P .

The Shannon capacity C, obtained by maximizing I(X;Y) over all random variables X such that E X² ≤ P, is given by

    C = (1/2) log(1 + P/N)  bits/transmission.        (2.1)

The continuous time Gaussian channel capacity is simply related to the discrete time capacity. If the signal x(t), 0 ≤ t ≤ T, has power constraint P and bandwidth constraint W, and the white noise Z(t), 0 ≤ t ≤ T, has power spectral density N, then the capacity of the channel Y(t) = x(t) + Z(t), 0 ≤ t ≤ T, is given by

    C = W log(1 + P/NW)  bits/second.        (2.2)

The relationship between (2.1) and (2.2) can be seen informally by replacing the continuous time processes by n = 2TW independent samples from the process and calculating the noise variance per sample. The full theory establishing (2.2) can be found in Wyner [3], Gallager [4], and Pollak, Landau, and Slepian [5]. Having said this, we restrict our treatment to time discrete
Gaussian channels.

Random codebook: Shannon observed in 1948 that a randomly selected codebook is good with high probability when the rate R of the codebook is less than the channel capacity C = max I(X;Y). As mentioned above, for the Gaussian channel the capacity is given by C = (1/2) log(1 + P/N) bits per transmission.

We now set up a codebook that will be used in all of the multiple user channel models below. The codewords comprising the codebook are vectors of length n and power P. To generate such a random codebook, simply choose 2^{nR} independent identically distributed random n-vectors {x(1), x(2), ..., x(2^{nR})}, each consisting of n independent Gaussian random variables with mean zero and variance P. The rate R will be specified later. Sometimes we will need two or more independently generated codebooks.

In the continuous channel case, one simply lets the white noise generator of power P and bandwidth W run for T seconds. Every T seconds, a new codeword is generated and we list them
until we fill up the codebook. Now we analyze the Gaussian channels shown in Figure 2.1.

[Fig. 2.1. Gaussian multiple user channels: the Shannon channel (Z ~ N(0,N)), the multiple access channel, the broadcast channel (Z1 ~ N(0,N1), Z2 ~ N(0,N2)), the degraded relay channel, and the interference channel.]

2.1 The Gaussian channel

Here Y = x + Z. Choose an R < C = (1/2) log(1 + P/N). Choose any index i in the set {1, 2, ..., 2^{nR}}. Send the ith vector x(i) from the
codebook generated above. The receiver observes Y = x(i) + Z, then finds the index î of the closest codeword to Y. If n is sufficiently large, the probability of error P(î ≠ i) will be arbitrarily small. This minimum distance decoding scheme for the Gaussian channel is essentially equivalent to finding the codeword in the codebook which is jointly typical with the received vector Y.

2.2 The multiple access channel
We consider m transmitters, each of power P. Let

    Y = Σ_{i=1}^{m} x_i + Z .

The capacity region for the Gaussian channel takes on the simple form given in the following equations:

    R_i < C(P/N)
    R_i + R_j < C(2P/N)
    R_i + R_j + R_k < C(3P/N)
    ...
    Σ_{i=1}^{m} R_i < C(mP/N),        (2.3)

where

    C(x) = (1/2) log(1 + x)        (2.4)

denotes the capacity of the Gaussian channel with signal to noise ratio x. When all the rates are the same, the last inequality dominates the others.

Here we need m codebooks, each with 2^{nR_i} codewords of power P. Transmission is simple. Each of the independent transmitters chooses whatever codeword he wishes from his own codebook.
The users simultaneously send these vectors. The receiver sees the codewords added together with the Gaussian noise Z. Optimal decoding consists of looking for the m codewords, one from each codebook, such that the vector sum is closest to Y in Euclidean distance. The set of m codewords achieving the minimum distance to Y corresponds to the hypothesized collection of messages sent.
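The subset constraints (2.3) are easy to check numerically. The following sketch is ours, not from the paper: the function names and the example values P = 10, N = 1 are illustrative, and logarithms are taken base 2 since rates are in bits.

```python
import itertools
import math

def C(x):
    """C(x) = (1/2) log2(1 + x), the capacity in bits per transmission
    of a Gaussian channel with signal-to-noise ratio x, as in (2.4)."""
    return 0.5 * math.log2(1 + x)

def in_mac_region(rates, P, N):
    """Check whether a rate tuple lies inside the Gaussian multiple
    access region (2.3): every nonempty subset S of the m senders
    must satisfy sum_{i in S} R_i < C(|S| * P / N)."""
    m = len(rates)
    for k in range(1, m + 1):
        for S in itertools.combinations(range(m), k):
            if sum(rates[i] for i in S) >= C(k * P / N):
                return False
    return True

# Illustrative check with m = 3, P = 10, N = 1.  For equal rates the
# sum constraint C(mP/N) is the binding one, as the text notes.
print(in_mac_region([0.8, 0.8, 0.8], P=10, N=1))   # prints True
print(in_mac_region([1.0, 1.0, 1.0], P=10, N=1))   # prints False: 3.0 >= C(30)
```

Here C(30) ≈ 2.48 bits, so three equal rates of 0.8 fit while three rates of 1.0 violate the sum constraint.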
If (R1, R2, ..., Rm) is in the capacity region given above, then the probability of error goes to zero as n tends to infinity.

Remarks: It is interesting to observe that the sum of the rates C(mP/N) of the users goes to infinity with m. Thus in a cocktail party with m people each of power P in the presence of ambient noise N, although the interference grows as the number of speakers increases, the intended listener receives an unbounded amount of information as the number of people goes to infinity. A similar conclusion holds of course for ground communications to a satellite.

It is also interesting to note that the optimal transmission scheme here does not involve time division multiplexing. In fact, each of the transmitters utilizes the entire time to send his message.

A practical consideration for ground transmission to a satellite involves the possible inability of the ground communicators to synchronize their transmissions. Nonetheless, it can be shown that the capacity is unchanged when there is a lack of synchronization [6].

2.3 The broadcast channel

Here we assume that we have a sender of power P and two distant
receivers, one with noise spectral density N1 and the other with noise spectral density N2. Without loss of generality, assume N1 < N2. Thus in some sense receiver Y1 is better than receiver Y2. The model for the channel is Y1 = x + Z1 and Y2 = x + Z2, where Z1 and Z2 are arbitrarily correlated Gaussian random variables with variances N1 and N2 respectively. The sender wishes to send independent messages at rates R1 and R2 to receivers Y1 and Y2 respectively.

Fortunately, all Gaussian broadcast channels belong to the class known as degraded broadcast channels. The capacity region for the Gaussian broadcast channel is given by

    R1 < C(αP/N1)
    R2 < C(ᾱP/(αP + N2)),        (2.5)

where 0 ≤ α ≤ 1, ᾱ = 1 − α. The parameter α may be arbitrarily chosen to trade off rate R1 for rate R2 as the transmitter wishes.

To encode the messages, the transmitter generates two codebooks,
one with power αP at rate R1, and another codebook with power ᾱP and rate R2. He has chosen R1 and R2 to satisfy the equations above. Then, to send an index i ∈ {1, 2, ..., 2^{nR1}} and j ∈ {1, 2, ..., 2^{nR2}} to Y1 and Y2 respectively, he takes codeword x(i) from the first codebook and codeword x̃(j) from the second codebook and computes the sum. He then sends the sum over the channel.

Two receivers must now do the decoding. First consider the bad receiver Y2. He merely looks through the second codebook for the closest codeword to his received vector Y2. His effective signal-to-noise ratio is ᾱP/(αP + N2), since Y1's message acts as noise to Y2. The good receiver Y1 first decodes Y2's codeword, which he can do because of his lower noise N1. He subtracts this codeword x̃(j) from Y1. This leaves him with a channel of power αP and noise N1. He then looks for the closest codeword in the first codebook to Y1 − x̃(j). The resulting probability of error can be made as low as wished.

A nice dividend of optimal encoding for degraded broadcast channels is that the better receiver Y1 always knows the message intended for receiver Y2 in addition to the extra information intended for himself.

2.4 The relay channel

For the relay channel, we have a sender X1 and an ultimate intended receiver Y. Also present, however, is the relay,
intended solely to help the sender. The channel is given by

    Y1 = x1 + Z1
    Y  = x1 + Z1 + x2 + Z2 ,        (2.6)

where Z1, Z2 are independent zero mean Gaussian random variables with variances N1, N2 respectively. The allowed encoding by the relay is the causal sequence

    X_{2i} = f_i(Y_{11}, Y_{12}, ..., Y_{1,i-1}) .        (2.7)

The sender X1 has power P1 and the relay X2 has power P2. The capacity is given by

    C* = max_{0 ≤ α ≤ 1} min { C((P1 + P2 + 2√(ᾱ P1 P2))/(N1 + N2)) , C(αP1/N1) } ,

where ᾱ = 1 − α.
Note that if

    P2/N2 ≥ P1/N1 ,

it can be seen that C* = C(P1/N1). (This is achieved by α = 1.)
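Both the capacity expression and the threshold claim above can be checked numerically. This sketch is ours, not from the paper: `relay_capacity` and the example values are hypothetical, the maximization over α is done by a simple grid search, and logs are base 2.

```python
import math

def C(x):
    # C(x) = (1/2) log2(1 + x): Gaussian channel capacity, in bits.
    return 0.5 * math.log2(1 + x)

def relay_capacity(P1, P2, N1, N2, steps=10000):
    """Grid-search evaluation of
       C* = max_{0<=a<=1} min{ C((P1 + P2 + 2*sqrt((1-a)*P1*P2)) / (N1 + N2)),
                               C(a*P1/N1) }.
    Illustrative numerics only, not a proof."""
    best = 0.0
    for k in range(steps + 1):
        a = k / steps
        abar = 1.0 - a
        decode_bound = C(a * P1 / N1)   # rate at which the relay can decode
        coop_bound = C((P1 + P2 + 2.0 * math.sqrt(abar * P1 * P2)) / (N1 + N2))
        best = max(best, min(decode_bound, coop_bound))
    return best

# Example with P2/N2 = P1/N1 = 10: the maximum is at a = 1, so C* equals
# C(P1/N1), and it exceeds the no-relay rate C(P1/(N1 + N2)).
P1, P2, N1, N2 = 10.0, 20.0, 1.0, 2.0
print(abs(relay_capacity(P1, P2, N1, N2) - C(P1 / N1)) < 1e-9)   # prints True
print(relay_capacity(P1, P2, N1, N2) > C(P1 / (N1 + N2)))        # prints True
```

The grid includes α = 1 exactly, where the two bounds coincide at C(P1/N1); for any α < 1 the decode bound is strictly smaller, which is why the threshold condition picks out α = 1.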
The channel appears to be noise free after the relay, and the capacity C(P1/N1) from X1 to the relay can be achieved. Thus the rate without the relay, C(P1/(N1 + N2)), is increased by the relay to C(P1/N1). For large N2, and for P2/N2 ≥ P1/N1, we see that the increment in rate is from C(P1/(N1 + N2)) ≈ 0 to C(P1/N1).

Encoding of information: Two codebooks are needed. The first codebook has 2^{nR} words of power αP1. The second has 2^{nR0} codewords of power ᾱP1. We shall use words from these codebooks successively in order to create the opportunity for cooperation by the relay. We start by sending a codeword from the first codebook. The relay now knows the index of this codeword since R < C(αP1/N1), but the intended receiver does not. However, the intended receiver has a list of possible codewords of size 2^{n(R − C(αP1/(N1 + N2)))}. The last calculation involves a result on list codes. In the next block the relay and the transmitter would like to
cooperate to resolve the receiver's uncertainty about the previously sent codeword on the receiver's list. Unfortunately, they cannot quite be sure what this list is. They do not know the received signal Y. Thus they randomly partition the first codebook into 2^{nR0} cells with an equal number of codewords in each cell. The relay, the receiver, and the transmitter agree on what this partition is. The relay and the transmitter find the cell of the partition in which the codeword from the first codebook lies and cooperatively send the codeword from the second codebook with that index. That is, both X1 and X2 send the same designated codeword. The relay, of course, must scale this codeword so that it meets his power constraint P2. They now simultaneously transmit their codewords. An important point is that the cooperative information sent by the relay and the transmitter is sent coherently, so the power of the sum as seen by the receiver Y is (√(ᾱP1) + √P2)².

However, this does not exhaust what the transmitter does in
the second block. He also chooses a fresh codeword from his first codebook, adds it "on paper" to the cooperative codeword from his second codebook, and sends this sum over the channel.

The reception by the ultimate receiver Y in the second block involves first finding the cooperative index from the second codebook by looking for the closest codeword in the second codebook. He subtracts it off, then calculates a list of indices of size 2^{nR0} corresponding to all transmitted words from the first book which might have been sent in that second block.

Now it is time for the intended receiver Y to finish computing the codeword from the first codebook sent in the first block. He takes his list of possible codewords that might have been sent
in the first block and intersects it with the cell of the partition that he has learned from the cooperative relay transmission in the second block. Since the rates and powers have been chosen judiciously, it is highly probable that there will be only one codeword in this intersection. This is Y's guess about the information sent in the first block.

We are now in steady state. In each new block, the transmitter and the relay cooperate to resolve the list uncertainty from the previous block. In addition, the transmitter adds some fresh information from his first codebook to his transmission from the second codebook and transmits the sum.

The receiver is always one block behind, but for sufficiently many blocks, this does not affect his overall rate of reception.

2.5 The interference channel

In the interference channel, we have two senders and two receivers.
Sender 1 wishes to send information to receiver 1. He does not care what receiver 2 receives or understands. Similarly with sender 2 and receiver 2. As can be seen, this channel involves interference of each user with the other. It is not quite a broadcast channel because there is only one intended receiver for each sender, nor is it quite a multiple access channel because each receiver is only interested in what is being sent by the corresponding transmitter.

This channel has not been solved in general, even in the Gaussian case. But remarkably, in the case of high interference, Carleial [7] has shown that the solution to this channel is the same as if there were no interference whatsoever. To achieve this, generate two codebooks, each with power P and rate C(P/N). Each sender independently chooses a word from his book and sends it. Now, if the interference is sufficiently high, the first receiver can understand perfectly the index of the second transmitter. He finds it by the usual technique of looking for the closest codeword to his received signal. Once he finds this signal, he subtracts it from his received waveform. Now there is a clean channel between the receiver and his sender. He then searches his sender's codebook to find the closest codeword and declares that codeword to be the one sent.

3. CONCLUSIONS

Random codes were used for all the results stated in the last section. Convexification through time sharing is generally needed for
rate regions involving more than one rate. Thus time sharing appears not to be needed for the Shannon channel and the relay channel. Superposition coding achieves nothing new in the multiple access channel (unless there is feedback), but appears to be essential for the broadcast and relay channels. Finally, list codes, Slepian-Wolf binning, and block Markov encoding arise in the relay channel.

It should be mentioned that there are more proof techniques that would have to be mentioned if we were to discuss all of the multiple user channels that have been studied to date.

REFERENCES

1. E.C. van der Meulen, "A Survey of Multi-way Channels in Information Theory: 1961-1976," IEEE Trans. on Information Theory, Vol. IT-23, No. 2, January 1977.

2. A. El Gamal and T. Cover, "Multiple User Information Theory," submitted as an Invited Paper to Proc. IEEE, 1980.

3. A.D. Wyner, "The Capacity of the Band-limited Gaussian Channel," Bell System Tech. J., Vol. 45, March 1965, pp. 359-371.

4. R.G. Gallager, Information Theory and Reliable Communication, Wiley, New York, 1968.

5. D. Slepian and H.O. Pollak, "Prolate Spheroidal Wave Functions, Fourier Analysis, and Uncertainty - I," Bell System Tech. J., Vol. 40, pp. 43-64. (Also see Landau and Pollak for Parts II and III.)

6. T. Cover, R.J. McEliece, and E. Posner, "Asynchronous Multiple Access Channel Capacity," Technical Report No. 35, Dept. of Statistics, Stanford University, Stanford, CA, December 1978, to appear IEEE Trans. on Information Theory.

7. A.B. Carleial, "A Case where Interference does not Reduce Capacity," IEEE Trans. on Information Theory, Vol. IT-21, September 1975, pp. 569-570.