Harvard SEAS ES250 – Information Theory: Gaussian Channel

**1 Definitions**

**Definition (Gaussian channel).** A discrete-time channel with input \(X_i\), noise \(Z_i\), and output \(Y_i\) at time \(i\):
\[
Y_i = X_i + Z_i,
\]
where the noise \(Z_i\) is drawn i.i.d. from \(\mathcal{N}(0, N)\) and is assumed to be independent of the signal \(X_i\). The input satisfies the power constraint \(E[X_i^2] \le P\).

**Definition.** The information capacity with power constraint \(P\) is
\[
C = \max_{E[X^2] \le P} I(X; Y).
\]

**Theorem.** The information capacity of the Gaussian channel is
\[
C = \max_{E[X^2] \le P} I(X; Y) = \frac{1}{2} \log\left(1 + \frac{P}{N}\right),
\]
where the maximum is attained when \(X \sim \mathcal{N}(0, P)\).

**Definition.** An \((M, n)\) code for the Gaussian channel with power constraint \(P\) consists of the following:

1. An index set \(\{1, 2, \ldots, M\}\).
2. An encoding function \(x : \{1, 2, \ldots, M\} \to \mathcal{X}^n\), yielding codewords \(x^n(1), x^n(2), \ldots, x^n(M)\) satisfying the power constraint; that is, for every codeword
\[
\sum_{i=1}^{n} x_i^2(w) \le nP, \qquad w = 1, 2, \ldots, M.
\]
3. A decoding function \(g : \mathcal{Y}^n \to \{1, 2, \ldots, M\}\).

**Definition.** A rate \(R\) is said to be achievable with power constraint \(P\) if there exists a sequence of \((2^{nR}, n)\) codes with codewords satisfying the power constraint such that the maximal probability of error \(\lambda^{(n)}\) tends to zero. The capacity of the channel is the supremum of the achievable rates.
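The capacity formula \(C = \frac{1}{2}\log_2(1 + P/N)\) is simple enough to evaluate directly. The sketch below (function name and example SNR values are illustrative, not from the notes) computes capacity in bits per channel use; note that at SNR \(P/N = 1\) the channel supports exactly half a bit per use:

```python
import math

def gaussian_capacity(P, N):
    """Capacity of the discrete-time Gaussian channel, in bits per
    channel use: C = (1/2) * log2(1 + P/N)."""
    return 0.5 * math.log2(1.0 + P / N)

# Signal power equal to noise power (SNR = 1): C = 0.5 bits/use.
print(gaussian_capacity(1.0, 1.0))    # 0.5

# Capacity grows only logarithmically as power increases.
for P in (1.0, 10.0, 100.0):
    print(P, gaussian_capacity(P, 1.0))
```

Using \(\log_2\) gives capacity in bits; the theorem's \(\log\) yields nats if taken as the natural logarithm.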

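To make the \((M, n)\) code definition concrete, here is a minimal sketch of a randomly generated codebook whose codewords respect the power constraint \(\sum_{i=1}^n x_i^2(w) \le nP\). This is only an illustration of the definition (the function name, rescaling step, and parameters are assumptions, not the construction used in the notes):

```python
import math
import random

def random_codebook(M, n, P, seed=0):
    """Generate M codewords of length n with entries drawn i.i.d. from
    N(0, P), rescaling any codeword that violates the power constraint
    sum_i x_i^2 <= n*P so that every codeword satisfies it exactly."""
    rng = random.Random(seed)
    book = []
    for _ in range(M):
        x = [rng.gauss(0.0, math.sqrt(P)) for _ in range(n)]
        energy = sum(v * v for v in x)
        if energy > n * P:
            # Project the codeword back onto the power-constraint ball.
            scale = math.sqrt(n * P / energy)
            x = [v * scale for v in x]
        book.append(x)
    return book

# M = 16 messages, blocklength n = 8, power constraint P = 2.0.
book = random_codebook(M=16, n=8, P=2.0)
assert all(sum(v * v for v in x) <= 8 * 2.0 + 1e-9 for x in book)
```

In the achievability proof for the capacity theorem, a codebook of this Gaussian-random flavor (with variance slightly below \(P\)) is the standard device; here the rescaling simply enforces the constraint from item 2 of the definition.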