Harvard SEAS ES250 – Information Theory
Network Information Theory*

*Based on Cover & Thomas, Chapter 15


1 Gaussian Multiple-User Channels

Definition. The basic discrete-time AWGN channel with input power $P$ and noise variance $N$ is modeled by
$$Y_i = X_i + Z_i, \quad i = 1, 2, \ldots$$
where $i$ is the time index and the $Z_i$ are i.i.d. Gaussian random variables, subject to the power constraint
$$\frac{1}{n} \sum_{i=1}^{n} X_i^2 \le P.$$

Definition. For convenience, define
$$C(x) = \frac{1}{2} \log(1 + x).$$

Theorem. The capacity $C$ of the basic AWGN channel is obtained by $\max_{E[X^2] \le P} I(X; Y)$ and is given by
$$C\left(\frac{P}{N}\right) = \frac{1}{2} \log\left(1 + \frac{P}{N}\right).$$

Theorem (Gaussian Multiple-Access Channel). The achievable rate region for the Gaussian multiple-access channel with $m$ users, each with power $P$, is
$$\sum_{i \in S} R_i < C\left(\frac{|S| P}{N}\right) \quad \text{for all } S \subseteq \{1, 2, \ldots, m\}.$$

Theorem (Gaussian Broadcast Channel). The capacity region of the Gaussian broadcast channel is
$$R_1 < C\left(\frac{\alpha P}{N_1}\right), \qquad R_2 < C\left(\frac{(1 - \alpha) P}{\alpha P + N_2}\right),$$
where $\alpha$ may be chosen arbitrarily ($0 \le \alpha \le 1$) to trade off rate $R_1$ against rate $R_2$ as the transmitter wishes.

Theorem (Gaussian Relay Channel). The capacity of the Gaussian relay channel is
$$C = \max_{0 \le \alpha \le 1} \min\left\{ C\left(\frac{P + P_1 + 2\sqrt{\bar{\alpha} P P_1}}{N_1 + N_2}\right),\; C\left(\frac{\alpha P}{N_1}\right) \right\},$$
where sender $X$ has power $P$, sender $X_1$ (the relay) has power $P_1$, and $\bar{\alpha} = 1 - \alpha$.
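As a quick numerical illustration (not part of the notes: the helper `C` simply mirrors the capacity function defined above, and the power values are arbitrary), the following sketch evaluates the two-user MAC pentagon and its successive-decoding corner point:

```python
import math

def C(x):
    """Capacity function C(x) = (1/2) log2(1 + x), in bits per channel use."""
    return 0.5 * math.log2(1 + x)

# Two-user Gaussian MAC, each user with power P, noise variance N.
P, N = 10.0, 1.0

# Pentagon constraints from the theorem: R1 < C(P/N), R2 < C(P/N), R1+R2 < C(2P/N).
R1_max, R2_max, Rsum_max = C(P / N), C(P / N), C(2 * P / N)
print(f"R1 < {R1_max:.3f}, R2 < {R2_max:.3f}, R1 + R2 < {Rsum_max:.3f}")

# Corner point via successive decoding: decode user 2 first, treating user 1's
# signal as extra noise, subtract its codeword, then decode user 1 cleanly.
R2_corner = C(P / (P + N))
R1_corner = C(P / N)
assert abs((R1_corner + R2_corner) - Rsum_max) < 1e-9  # corner lies on the sum-rate face
```

The assertion checks the identity $C(P/N) + C(P/(P+N)) = C(2P/N)$, which is why successive decoding attains the sum-rate bound.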
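The broadcast-channel trade-off can be traced by sweeping the power split. A minimal sketch, assuming receiver 1 is the less noisy one ($N_1 < N_2$), as the $\alpha P + N_2$ denominator suggests; the parameter values are illustrative:

```python
import math

def C(x):
    return 0.5 * math.log2(1 + x)

# Degraded Gaussian broadcast channel: receiver 1 has lower noise (N1 < N2).
P, N1, N2 = 10.0, 1.0, 4.0

# Sweeping alpha traces the boundary of the capacity region.
for alpha in (0.0, 0.25, 0.5, 0.75, 1.0):
    R1 = C(alpha * P / N1)                      # receiver 1 gets power alpha * P
    R2 = C((1 - alpha) * P / (alpha * P + N2))  # receiver 2 treats user 1's signal as noise
    print(f"alpha = {alpha:.2f}: R1 < {R1:.3f}, R2 < {R2:.3f}")
```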

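Likewise, the relay capacity is a max-min over $\alpha$ that is easy to evaluate numerically. A grid-search sketch of the expression above; `relay_capacity` and the parameter choices are my own:

```python
import math

def C(x):
    return 0.5 * math.log2(1 + x)

def relay_capacity(P, P1, N1, N2, steps=10000):
    """Evaluate max over alpha in [0, 1] of
    min( C((P + P1 + 2*sqrt((1-alpha)*P*P1)) / (N1 + N2)), C(alpha*P / N1) )."""
    best = 0.0
    for k in range(steps + 1):
        alpha = k / steps
        coherent = C((P + P1 + 2 * math.sqrt((1 - alpha) * P * P1)) / (N1 + N2))
        direct = C(alpha * P / N1)
        best = max(best, min(coherent, direct))
    return best

print(f"relay capacity ~ {relay_capacity(P=10.0, P1=10.0, N1=1.0, N2=1.0):.3f} bits/use")
```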

2 Jointly Typical Sequences

Definition. The set $A_\epsilon^{(n)}$ of $\epsilon$-typical $n$-sequences $(x_1, x_2, \ldots, x_k)$ is defined by
$$A_\epsilon^{(n)}(X^{(1)}, X^{(2)}, \ldots, X^{(k)}) = \left\{ (x_1, x_2, \ldots, x_k) : \left| -\frac{1}{n} \log p(s) - H(S) \right| < \epsilon, \;\forall S \subseteq \{X^{(1)}, X^{(2)}, \ldots, X^{(k)}\} \right\}.$$

Definition. We will use the notation $a_n \doteq 2^{n(b \pm \epsilon)}$ to mean that
$$\left| \frac{1}{n} \log a_n - b \right| < \epsilon$$
for $n$ sufficiently large.

Theorem. For any $\epsilon > 0$ and sufficiently large $n$:
1. $P(A_\epsilon^{(n)}(S)) \ge 1 - \epsilon$, for all $S \subseteq \{X^{(1)}, X^{(2)}, \ldots, X^{(k)}\}$.
2. If $s \in A_\epsilon^{(n)}(S)$, then $p(s) \doteq 2^{-n(H(S) \pm \epsilon)}$.
3. $|A_\epsilon^{(n)}(S)| \doteq 2^{n(H(S) \pm 2\epsilon)}$.
4. Let $S_1, S_2 \subseteq \{X^{(1)}, X^{(2)}, \ldots, X^{(k)}\}$. If $(s_1, s_2) \in A_\epsilon^{(n)}(S_1, S_2)$, then
$$p(s_1 \mid s_2) \doteq 2^{-n(H(S_1 \mid S_2) \pm 2\epsilon)}.$$

Theorem. Let $S_1, S_2$ be two subsets of $\{X^{(1)}, X^{(2)}, \ldots, X^{(k)}\}$. For any $\epsilon > 0$, define $A_\epsilon^{(n)}(S_1 \mid s_2)$ to be the set of $s_1$ sequences that are jointly $\epsilon$-typical with a particular $s_2$ sequence. If $s_2 \in A_\epsilon^{(n)}(S_2)$, then for sufficiently large $n$ we have
$$|A_\epsilon^{(n)}(S_1 \mid s_2)| \le 2^{n(H(S_1 \mid S_2) + 2\epsilon)}$$
and
$$(1 - \epsilon)\, 2^{n(H(S_1 \mid S_2) - 2\epsilon)} \le \sum_{s_2} p(s_2) |A_\epsilon^{(n)}(S_1 \mid s_2)|.$$

Theorem. Let $A_\epsilon^{(n)}$ denote the typical set for the probability mass function $p(s_1, s_2, s_3)$, and let
$$P(S_1' = s_1, S_2' = s_2, S_3' = s_3) = \prod_{i=1}^{n} p(s_{1i} \mid s_{3i})\, p(s_{2i} \mid s_{3i})\, p(s_{3i}).$$
Then
$$P\{(S_1', S_2', S_3') \in A_\epsilon^{(n)}\} \doteq 2^{-n(I(S_1; S_2 \mid S_3) \pm 6\epsilon)}.$$
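A small simulation makes item 1 of the first theorem concrete for a single Bernoulli source (the $k = 1$ case); the function name and parameter values below are illustrative, not from the notes:

```python
import math, random

def empirical_typicality(p, eps, n, trials, seed=0):
    """Fraction of i.i.d. Bernoulli(p) n-sequences x^n satisfying
    | -(1/n) log2 p(x^n) - H(X) | < eps, i.e. landing in A_eps^(n)."""
    rng = random.Random(seed)
    H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)  # binary entropy H(X)
    hits = 0
    for _ in range(trials):
        k = sum(rng.random() < p for _ in range(n))      # number of ones drawn
        log_prob = k * math.log2(p) + (n - k) * math.log2(1 - p)
        hits += abs(-log_prob / n - H) < eps
    return hits / trials

# P(A_eps^(n)) should approach 1 as n grows, per item 1 of the theorem.
for n in (100, 1000, 10000):
    print(n, empirical_typicality(p=0.3, eps=0.02, n=n, trials=500))
```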
3 Multiple Access Channel

Definition. A discrete memoryless multiple-access channel consists of three alphabets, $\mathcal{X}_1$, $\mathcal{X}_2$, and $\mathcal{Y}$, and a probability transition matrix $p(y \mid x_1, x_2)$.
Definition. A $((2^{nR_1}, 2^{nR_2}), n)$ code for the multiple-access channel consists of two sets of integers $\mathcal{W}_1 = \{1, 2, \ldots, 2^{nR_1}\}$ and $\mathcal{W}_2 = \{1, 2, \ldots, 2^{nR_2}\}$, called the message sets, two encoding functions,
$$X_1 : \mathcal{W}_1 \to \mathcal{X}_1^n, \qquad X_2 : \mathcal{W}_2 \to \mathcal{X}_2^n,$$
and a decoding function,
$$g : \mathcal{Y}^n \to \mathcal{W}_1 \times \mathcal{W}_2.$$

Definition. The average probability of error for the $((2^{nR_1}, 2^{nR_2}), n)$ code is
$$P_e^{(n)} = \frac{1}{2^{n(R_1 + R_2)}} \sum_{(w_1, w_2) \in \mathcal{W}_1 \times \mathcal{W}_2} P\{ g(Y^n) \ne (w_1, w_2) \mid (w_1, w_2) \text{ sent} \}.$$
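To make the code definition concrete, here is a toy $((2^{nR_1}, 2^{nR_2}), n)$ code for a hypothetical noiseless binary adder MAC with $Y = X_1 + X_2$; the channel choice and the random codebooks are illustrative only:

```python
import itertools, random

# Toy ((2^{nR1}, 2^{nR2}), n) code: n = 3 and two messages per sender,
# so R1 = R2 = 1/3 bit per channel use.
n, M1, M2 = 3, 2, 2
random.seed(1)

# Encoding functions X1 : W1 -> X1^n and X2 : W2 -> X2^n, as lookup tables.
enc1 = {w: tuple(random.randint(0, 1) for _ in range(n)) for w in range(M1)}
enc2 = {w: tuple(random.randint(0, 1) for _ in range(n)) for w in range(M2)}

def channel(x1, x2):
    """Noiseless adder MAC: Y_i = X1_i + X2_i, with Y_i in {0, 1, 2}."""
    return tuple(a + b for a, b in zip(x1, x2))

# Decoding function g : Y^n -> W1 x W2, keeping the first message pair seen
# for each output; any later pair mapping to the same output is decoded wrongly.
g = {}
for w1, w2 in itertools.product(range(M1), range(M2)):
    g.setdefault(channel(enc1[w1], enc2[w2]), (w1, w2))

# Average probability of error, uniform over the message pairs.
errors = sum(g[channel(enc1[w1], enc2[w2])] != (w1, w2)
             for w1, w2 in itertools.product(range(M1), range(M2)))
print("P_e^(n) =", errors / (M1 * M2))
```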
