Channel Capacity
Besma Smida, ECE 595K: Chapter 5, Fall 2011 (49 slides)
Outline
- Channel Capacity
- Jointly Typical Sequences
- Channel Coding Theorem
- Linear Block Codes
- Joint Source and Channel Coding
- Feedback Capacity
Source and Channel Coding

[Block diagram: In -> Compress (source coding) -> Encode (channel coding) -> Noisy Channel -> Decode -> Decompress -> Out]

Source coding is the process of compressing the data, using fewer bits by removing redundancy. Shannon's source coding theorem establishes the limits of possible data compression.

Channel coding adds redundancy to protect against channel errors. The Shannon limit, or Shannon capacity, of a communications channel is the theoretical maximum information transfer rate of the channel for a particular noise level.
Discrete Memoryless Channel (DMC)

Definition: a discrete channel is the (physical or abstract) link connecting the input X to the output Y, described by the conditional probability p(y|x) that the output is y when the input is x.

Time-invariant transition-probability matrix: P_{i,j} = p(Y = y_j | X = x_i), where each row of P sums to 1.

Memoryless: p(y_n | x_1, x_2, ..., x_n, y_1, y_2, ..., y_{n-1}) = p(y_n | x_n).
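The transition-matrix view can be made concrete with a short simulation: one channel use is a draw from the row of P selected by the input. This is a minimal Python sketch, not from the slides; the BSC crossover probability f = 0.1 is an assumed example value.

```python
import random

def dmc_output(x, P, outputs, rng=random):
    """Draw one output symbol for input index x from a DMC with
    transition matrix P (row x is p(y | x) and sums to 1)."""
    r = rng.random()
    cum = 0.0
    for j, p in enumerate(P[x]):
        cum += p
        if r < cum:
            return outputs[j]
    return outputs[-1]  # guard against floating-point round-off

# Example: a BSC with assumed crossover probability f = 0.1
f = 0.1
P = [[1 - f, f],
     [f, 1 - f]]

random.seed(0)  # reproducible sampling
ys = [dmc_output(0, P, [0, 1]) for _ in range(10_000)]
flip_rate = sum(ys) / len(ys)  # empirical p(Y = 1 | X = 0), close to f
print(flip_rate)
```

The empirical flip rate over many uses concentrates near f, which is exactly the memoryless property: each use is an independent draw governed only by the current input.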
Binary Channels

Binary Symmetric Channel (BSC): X = {0, 1} and Y = {0, 1}. Each input bit is flipped with probability f and received correctly with probability 1 - f.

Binary Erasure Channel (BEC): X = {0, 1} and Y = {0, ?, 1}. Each input bit is erased (received as ?) with probability f and received correctly with probability 1 - f; a transmitted bit is never flipped.

Z Channel: X = {0, 1} and Y = {0, 1}. A transmitted 0 is always received correctly; a transmitted 1 is received as 0 with probability f and as 1 with probability 1 - f.
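Anticipating the capacity definition a few slides on, the BSC and BEC have well-known closed-form capacities, C = 1 - H(f) and C = 1 - f respectively; these standard results are not derived on this slide, but they are easy to evaluate numerically. A small sketch with an assumed f = 0.1:

```python
from math import log2

def h2(p):
    """Binary entropy H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

f = 0.1
c_bsc = 1 - h2(f)  # BSC capacity: 1 - H(f)  (standard result)
c_bec = 1 - f      # BEC capacity: 1 - f     (standard result)
print(c_bsc, c_bec)
```

Note that for the same f the BEC capacity is higher than the BSC capacity: an erasure tells the receiver where the damage is, while a flip does not.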
Symmetric and Weakly Symmetric Channels

Weakly symmetric channels:
- All rows of P are permutations of each other, so each row has the same entropy.
- All columns of P have the same sum, so if X is uniform then Y is uniform:
  p(y) = Σ_{x ∈ X} p(y|x) p(x) = (1/|X|) Σ_{x ∈ X} p(y|x) = 1/|Y|

Symmetric channels: all rows of P are permutations of each other, and all columns of P are permutations of each other. Every symmetric channel is weakly symmetric.
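The "uniform in, uniform out" property above can be checked exactly with rational arithmetic. This sketch uses an assumed weakly symmetric example (2 inputs, 3 outputs; the two rows are permutations of each other and every column sums to 2/3):

```python
from fractions import Fraction as F

# Assumed weakly symmetric transition matrix (rows are permutations
# of each other; each column sums to 2/3).
P = [[F(1, 3), F(1, 6), F(1, 2)],
     [F(1, 3), F(1, 2), F(1, 6)]]

n_in, n_out = len(P), len(P[0])
px = [F(1, n_in)] * n_in  # uniform input distribution

# p(y) = sum_x p(y|x) p(x); exact fractions, no rounding
py = [sum(px[i] * P[i][j] for i in range(n_in)) for j in range(n_out)]
print(py)  # every entry equals 1/|Y| = 1/3
```

Using `Fraction` makes the identity p(y) = 1/|Y| hold exactly rather than up to floating-point error, which matches the algebra on the slide.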
Channel Capacity

Definition: the "information" channel capacity of a discrete memoryless channel is

C = max_{p(x)} I(X; Y),

where the maximum is taken over all possible input distributions p(x). Since I(X; Y) is a concave function of p(x) for fixed p(y|x), there is only one maximum. We want to find the p(x) that maximizes I(X; Y).
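Because I(X; Y) is concave in p(x), a simple one-dimensional search over binary input distributions already finds the capacity of any binary-input channel. A sketch, not from the slides, using the Z channel with an assumed f = 0.5 (whose known capacity is log2(5/4) ≈ 0.322 bits):

```python
from math import log2

def mi(px0, P):
    """I(X;Y) in bits for binary input distribution (px0, 1 - px0)
    and 2x2 transition matrix P (rows sum to 1)."""
    px = [px0, 1 - px0]
    py = [sum(px[i] * P[i][j] for i in range(2)) for j in range(2)]
    total = 0.0
    for i in range(2):
        for j in range(2):
            if px[i] > 0 and P[i][j] > 0:
                total += px[i] * P[i][j] * log2(P[i][j] / py[j])
    return total

# Z channel, f = 0.5: a 0 is always received correctly,
# a 1 is received as 0 with probability 0.5.
P = [[1.0, 0.0],
     [0.5, 0.5]]

# Concavity in p(x) means a fine grid search finds the single maximum.
grid = [k / 1000 for k in range(1001)]
C = max(mi(p, P) for p in grid)
print(C)
```

Note that the maximizing input distribution for the Z channel is not uniform: the noisy input symbol 1 is used less often than 0.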
Properties of Channel Capacity

Recall that the mutual information can be written as

I(X; Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)

Hence

0 ≤ C ≤ min(H(X), H(Y)) ≤ min(log|X|, log|Y|)
n-Use Channel Capacity

The capacity of n uses of the channel:

C^(n) = (1/n) max_{p(x_1, ..., x_n)} I(X_1, ..., X_n; Y_1, ..., Y_n)

For a discrete memoryless channel we have:

I(X^n; Y^n) = H(Y_1, ..., Y_n) - H(Y_1, ..., Y_n | X_1, ..., X_n)
            = H(Y_1, ..., Y_n) - Σ_{i=1}^n H(Y_i | Y_{i-1}, ..., Y_1, X_1, ..., X_n)   (chain rule)
            = Σ_{i=1}^n H(Y_i | Y_{i-1}, ..., Y_1) - Σ_{i=1}^n H(Y_i | X_i)            (memoryless)
            ≤ Σ_{i=1}^n H(Y_i) - Σ_{i=1}^n H(Y_i | X_i)       (conditioning reduces entropy)
            = Σ_{i=1}^n I(X_i; Y_i)
            ≤ nC

Thus C^(n) = C, and the optimal n-symbol input distribution is i.i.d.
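The conclusion that i.i.d. inputs are optimal can be checked numerically: for a memoryless channel driven by independent inputs, the mutual information over two uses is exactly twice the single-use value. A sketch (not from the slides) for the BSC with an assumed f = 0.1 and uniform inputs, which achieve the BSC capacity:

```python
from math import log2

def mutual_info(pxy):
    """I(X;Y) in bits from a joint distribution {(x, y): prob}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0) + p
        py[y] = py.get(y, 0) + p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# One use of a BSC with f = 0.1 and uniform input
f, px = 0.1, {0: 0.5, 1: 0.5}
P = {(0, 0): 1 - f, (0, 1): f, (1, 0): f, (1, 1): 1 - f}
single = mutual_info({(x, y): px[x] * P[(x, y)]
                      for x in (0, 1) for y in (0, 1)})

# Two independent uses: X = (x1, x2), Y = (y1, y2)
pair = mutual_info({((x1, x2), (y1, y2)):
                    px[x1] * P[(x1, y1)] * px[x2] * P[(x2, y2)]
                    for x1 in (0, 1) for x2 in (0, 1)
                    for y1 in (0, 1) for y2 in (0, 1)})
print(single, pair)  # pair equals 2 * single, i.e. I(X^n;Y^n) = n C here
```

This mirrors the derivation above: with independent inputs every inequality in the chain holds with equality, so n uses of the channel carry exactly n times the single-use mutual information.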
