Homework 5 - Information Theory and Coding
V Balakrishnan
Department of ECE
Johns Hopkins University
October 16, 2006
1 Problem 7.16 - Encoder and decoder as part of channel
1.1 Part a
The decoder will choose the codeword whose probability is maximum given the received sequence.
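Concretely, over a binary symmetric channel with crossover probability p < 1/2, maximizing P(received | codeword) is equivalent to choosing the codeword at minimum Hamming distance from the received word. A minimal sketch (the codebook and received word below are made up for illustration, not taken from the problem):

```python
# Hypothetical codebook for illustration only.
codebook = ["0000000", "1101001", "0111010", "1010011"]

def hamming(a, b):
    # Number of positions in which two equal-length words differ
    return sum(x != y for x, y in zip(a, b))

def ml_decode(received, codebook):
    # On a BSC with p < 1/2, argmax P(r | c) = argmin d_H(r, c)
    return min(codebook, key=lambda c: hamming(received, c))

print(ml_decode("1101011", codebook))  # -> '1101001' (distance 1)
```

Ties in distance are broken by codebook order here; a real decoder would declare an erasure or pick arbitrarily.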
Homework 7 - Information Theory and Coding
V Balakrishnan
Department of ECE
Johns Hopkins University
October 30, 2006
1 Problem 3.1
NOTE: I use the book's convention.
The generator matrix would simply be

G = [P | I4] =
0 1 1 1 1 0 0 0
1 1 1 0 0 1 0 0
1 1 0 1 0 0 1 0
1 0 1 1 0 0 0 1
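Encoding with a systematic generator matrix is just v = uG over GF(2): XOR together the rows of G selected by the 1-bits of the message. A short sketch, assuming the (8,4) systematic form G = [P | I4] with the identity in the last four columns (the message below is arbitrary):

```python
# Rows of the systematic (8,4) generator matrix G = [P | I4]
G = [
    [0, 1, 1, 1, 1, 0, 0, 0],
    [1, 1, 1, 0, 0, 1, 0, 0],
    [1, 1, 0, 1, 0, 0, 1, 0],
    [1, 0, 1, 1, 0, 0, 0, 1],
]

def encode(u):
    # v = u G over GF(2): XOR the rows of G picked out by the message bits
    v = [0] * len(G[0])
    for bit, row in zip(u, G):
        if bit:
            v = [a ^ b for a, b in zip(v, row)]
    return v

print(encode([1, 0, 1, 1]))  # -> [0, 0, 0, 1, 1, 0, 1, 1]
```

Because G is systematic, the last four bits of every codeword reproduce the message unchanged, and the first four are the parity checks.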
Homework 2 - Information Theory and Coding
V Balakrishnan
Department of ECE
Johns Hopkins University
September 25, 2006
1 Problem 3
We know that

H(p) = -∑_{x∈X} p(x) log p(x)

and that each term -p(x) log p(x) ≥ 0, since 0 ≤ p(x) ≤ 1. Hence H(p) ≥ 0, and the minimum value will be attained if each term p(x) log p(x) = 0, which is possible only when every p(x) is either 0 or 1, i.e. when the distribution is deterministic.
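The argument above can be checked numerically; a small sketch in plain Python (the example distributions are arbitrary):

```python
import math

def entropy(p):
    # H(p) = -sum p(x) log2 p(x); terms with p(x) = 0 contribute 0
    # (by the convention x log x -> 0 as x -> 0)
    return sum(-px * math.log2(px) for px in p if px > 0)

print(entropy([0.5, 0.5]))       # -> 1.0 (one fair bit)
print(entropy([1.0, 0.0, 0.0]))  # -> 0.0 (point mass attains the minimum)
```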
Homework 1 - Information Theory and Coding
V Balakrishnan
Department of ECE
Johns Hopkins University
September 18, 2006
1 Problem 1
Consider

Y = -log(X)

where X is uniformly distributed on (0, 1). The cumulative distribution of Y is

F(y) = P(Y ≤ y) = P(-log(X) ≤ y) = P(X ≥ e^{-y}) = 1 - e^{-y},  y ≥ 0

So the probability density of Y is f(y) = e^{-y} for y ≥ 0; that is, Y is exponentially distributed with mean 1.
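The derived CDF can be sanity-checked by simulation, assuming (as the algebra requires) that X is uniform on (0, 1):

```python
import math
import random

random.seed(0)
n = 100_000
# Y = -log(X) with X ~ Uniform(0, 1); use 1 - random() to avoid log(0)
samples = [-math.log(1.0 - random.random()) for _ in range(n)]

y = 1.0
empirical = sum(s <= y for s in samples) / n
theoretical = 1 - math.exp(-y)  # F(y) = 1 - e^{-y}: Exponential(1)
print(abs(empirical - theoretical) < 0.01)  # -> True
```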