Lecture 11 notes (MIT 6.441)

Upper bound on probability: proof continued

Continuing from the previous page, the conditional error probability equals the probability of the union of the pairwise events A(m', x^N(m), y^N) over all m' ≠ m, and we bound it by the union bound raised to a power ρ:

\[
\Pr\big[\mathrm{error} \mid m, x^N(m), y^N\big]
= \Pr\Big[\bigcup_{m' \neq m} A\big(m', x^N(m), y^N\big)\Big]
\le \Big[\sum_{m' \neq m} \Pr\big[A\big(m', x^N(m), y^N\big)\big]\Big]^{\rho},
\qquad 0 \le \rho \le 1 .
\]

Why not just use the union bound

\[
\Pr\Big[\bigcup_{m' \neq m} A\big(m', x^N(m), y^N\big)\Big]
\le \sum_{m' \neq m} \Pr\big[A\big(m', x^N(m), y^N\big)\big] \, ?
\]

Raising its right-hand side to a power ρ ∈ [0, 1] preserves the bound: if the RHS is ≥ 1, then it remains ≥ 1 even after being raised to such a power, so it still dominates a probability; if the RHS is ≤ 1, then it only increases when raised to a power in [0, 1].

Let us now compute Pr[A(m', x^N(m), y^N)] as a sum over the possible encodings of m'.

Upper bound on probability: proof continued

\[
\Pr\big[A\big(m', x^N(m), y^N\big)\big]
= \sum_{x^N(m')\,:\; P_{Y^N|X^N}(y^N \mid x^N(m')) \,\ge\, P_{Y^N|X^N}(y^N \mid x^N(m))}
P_{X^N}\big(x^N(m')\big)
\;\le\; \sum_{x^N(m')} P_{X^N}\big(x^N(m')\big)\,
\frac{P_{Y^N|X^N}\big(y^N \mid x^N(m')\big)^{r}}{P_{Y^N|X^N}\big(y^N \mid x^N(m)\big)^{r}}
\qquad \text{for any } r > 0,
\]

since the likelihood ratio is at least 1 on the set over which the first sum is taken, so raising it to the power r keeps it at least 1 there, and the remaining terms added by the second sum are nonnegative. Note that the last expression does not depend on m', because we sum over all the possible codewords for m'.

Upper bound on probability: proof continued

Combining results, we obtain

\[
\Pr\big[\mathrm{error} \mid m, X^N = x^N(m), Y^N = y^N\big]
\le \Big[\sum_{m' \neq m} \Pr\big[A\big(m', x^N(m), y^N\big)\big]\Big]^{\rho}
\le \Big[(M-1) \sum_{x^N(m')} P_{X^N}\big(x^N(m')\big)\,
\frac{P_{Y^N|X^N}\big(y^N \mid x^N(m')\big)^{r}}{P_{Y^N|X^N}\big(y^N \mid x^N(m)\big)^{r}}\Big]^{\rho},
\]

where the factor M − 1 appears because the inner sum is the same for every m' ≠ m.
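The two bounding steps above can be checked numerically. The following is a minimal sketch, not part of the lecture, using assumed illustrative parameters: a binary symmetric channel with crossover probability 0.1, blocklength N = 6, a uniform i.i.d. input distribution, and M = 8 messages. By exhaustive enumeration it verifies that the bound with parameter r dominates the exact pairwise probability Pr[A(m', x^N(m), y^N)], and that raising the union-bound sum to a power ρ ∈ [0, 1] still dominates the probability it replaces.

import itertools
import numpy as np

# Toy setting (assumed for this sketch only): memoryless BSC with crossover
# probability eps, i.i.d. input distribution P_X, and blocklength N small
# enough to enumerate all 2^N possible codewords exactly.
eps = 0.1                      # BSC crossover probability (illustrative)
N = 6                          # blocklength (illustrative)
P_X = np.array([0.5, 0.5])     # uniform i.i.d. input distribution

def p_x_vec(x):
    """P_{X^N}(x^N) for the i.i.d. input distribution."""
    return np.prod(P_X[x])

def p_y_given_x(y, x):
    """P_{Y^N|X^N}(y^N | x^N) for the memoryless BSC."""
    flips = np.sum(y != x)
    return (eps ** flips) * ((1 - eps) ** (N - flips))

rng = np.random.default_rng(0)
x_m = rng.integers(0, 2, N)    # transmitted codeword x^N(m)
y = rng.integers(0, 2, N)      # a received sequence y^N

all_seqs = [np.array(s) for s in itertools.product([0, 1], repeat=N)]

# Exact Pr[A(m', x^N(m), y^N)]: probability, over the random codeword for m',
# that the competing likelihood is at least as large as the true one.
p_y_true = p_y_given_x(y, x_m)
exact = sum(p_x_vec(xp) for xp in all_seqs if p_y_given_x(y, xp) >= p_y_true)

# Bound with parameter r > 0: sum over ALL x^N(m') of
#   P_{X^N}(x') * [P(y|x') / P(y|x(m))]^r.
# The indicator can be dropped because the likelihood ratio is >= 1 exactly
# on the event A, so its r-th power is >= 1 there and >= 0 elsewhere.
r = 0.5
bound_r = sum(p_x_vec(xp) * (p_y_given_x(y, xp) / p_y_true) ** r
              for xp in all_seqs)
assert exact <= bound_r + 1e-12
print(f"exact Pr[A] = {exact:.4f}  <=  r-bound = {bound_r:.4f}")

# The rho-power step: a probability is at most min(1, s) whenever s upper
# bounds it, and min(1, s) <= s^rho for rho in [0, 1], because s^rho >= s
# when s <= 1 and s^rho >= 1 when s >= 1.
M = 8                          # number of messages in the toy code
s = (M - 1) * bound_r          # the union-bound sum before taking the power
for rho in (0.0, 0.25, 0.5, 1.0):
    assert min(1.0, s) <= s ** rho + 1e-12
    print(f"rho = {rho:.2f}: min(1, s) = {min(1.0, s):.4f} <= s^rho = {s ** rho:.4f}")

Any r > 0 and any ρ ∈ [0, 1] pass these checks; the particular values of eps, N, M, r, and ρ here are arbitrary illustrations, not choices made in the lecture.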