Lecture 11 notes



Upper bound on probability — Proof continued

Starting from the bound of the previous slide (valid for any $r > 0$ and $0 \le \rho \le 1$),
\[
P_{e,m} \le \sum_{y^N} P_{Y^N|X^N}\big(y^N \mid x^N(m)\big) \left[ \sum_{m' \ne m} \left( \frac{P_{Y^N|X^N}\big(y^N \mid x^N(m')\big)}{P_{Y^N|X^N}\big(y^N \mid x^N(m)\big)} \right)^{\!r} \right]^{\rho},
\]
averaging the error over all possible codes gives
\[
\mathbb{E}_{\text{codebooks}}[P_{e,m}] \le (M-1)^{\rho} \sum_{y^N} \sum_{x^N(m)} P_{X^N}\big(x^N(m)\big) \left[ P_{Y^N|X^N}\big(y^N \mid x^N(m)\big) \right]^{1-r\rho} \left[ \sum_{x^N(m')} P_{X^N}\big(x^N(m')\big) \left[ P_{Y^N|X^N}\big(y^N \mid x^N(m')\big) \right]^{r} \right]^{\rho}.
\]

Upper bound on probability — Proof continued

Picking $r = \frac{1}{1+\rho}$ implies $1 - r\rho = r$, so
\[
\mathbb{E}_{\text{codebooks}}[P_{e,m}] \le (M-1)^{\rho} \sum_{y^N} \left[ \sum_{x^N} P_{X^N}(x^N)\, P_{Y^N|X^N}(y^N \mid x^N)^{\frac{1}{1+\rho}} \right]^{1+\rho}.
\]
QED!

Upper bound on probability

Have we used the DMC nature of the channel? Only insofar as it provides block-by-block memorylessness. Let us now make greater use of the DMC assumption. We assume
\[
P_{X^N}(x^N) = \prod_{i=1}^{N} P_X(x_i),
\]
so
\begin{align*}
\mathbb{E}_{\text{codebooks}}[P_{e,m}]
&\le (M-1)^{\rho} \sum_{y_1} \cdots \sum_{y_N} \left[ \sum_{x_1} \cdots \sum_{x_N} \prod_{i=1}^{N} P_X(x_i)\, P_{Y|X}(y_i \mid x_i)^{\frac{1}{1+\rho}} \right]^{1+\rho} \\
&= (M-1)^{\rho} \sum_{y_1} \cdots \sum_{y_N} \left[ \prod_{i=1}^{N} \sum_{x} P_X(x)\, P_{Y|X}(y_i \mid x)^{\frac{1}{1+\rho}} \right]^{1+\rho} \\
&= (M-1)^{\rho} \prod_{i=1}^{N} \sum_{y} \left[ \sum_{x} P_X(x)\, P_{Y|X}(y \mid x)^{\frac{1}{1+\rho}} \right]^{1+\rho} \\
&= (M-1)^{\rho} \left\{ \sum_{y} \left[ \sum_{x} P_X(x)\, P_{Y|X}(y \mid x)^{\frac{1}{1+\rho}} \right]^{1+\rho} \right\}^{N}.
\end{align*}

Upper bound on probability

From our definition of $M$ and $R$, $M - 1 \le 2^{NR}$. Hence
\[
\mathbb{E}_{\text{codebooks}}[P_{e,m}] \le 2^{NR\rho} \left\{ \sum_{y} \left[ \sum_{x} P_X(x)\, P_{Y|X}(y \mid x)^{\frac{1}{1+\rho}} \right]^{1+\rho} \right\}^{N}.
\]
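The final bound decays exponentially in $N$ whenever $-\frac{1}{N}\log_2$ of the right-hand side is positive, i.e. whenever $E_0(\rho, P_X) - \rho R > 0$ for some $\rho \in [0,1]$, where $E_0$ is the bracketed Gallager function above. As an illustration (my own numerical sketch, not from the lecture), the snippet below evaluates $E_0$ for a binary symmetric channel with uniform input and checks that the exponent is positive at a rate below capacity; the channel parameters and rate chosen here are arbitrary examples:

```python
# Numerical sketch of the random-coding bound for a BSC (illustrative values,
# not from the lecture): E_0(rho) = -log2 sum_y ( sum_x P_X(x) P(y|x)^{1/(1+rho)} )^{1+rho}
import math

def E0(rho, p, px=(0.5, 0.5)):
    """Gallager's E_0 for a BSC with crossover probability p and input law px."""
    P = [[1 - p, p], [p, 1 - p]]  # transition matrix P(y|x)
    total = 0.0
    for y in range(2):
        inner = sum(px[x] * P[x][y] ** (1.0 / (1.0 + rho)) for x in range(2))
        total += inner ** (1.0 + rho)
    return -math.log2(total)

p = 0.1   # example crossover probability
capacity = 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

R = 0.3   # an example rate below capacity
# Best exponent over a grid of rho in [0, 1]: E_0(rho) - rho * R.
exponent = max(E0(k / 100, p) - (k / 100) * R for k in range(101))
print(f"capacity ≈ {capacity:.3f} bits, exponent at R={R}: {exponent:.4f}")
# With this exponent, E[P_e,m] <= 2^{-N * exponent}: the averaged error
# probability vanishes exponentially in the blocklength N.
```

Note that $E_0(0) = 0$ always, so the exponent is positive only because $E_0$ grows with $\rho$ at rate $\frac{\partial E_0}{\partial \rho}\big|_{\rho=0} = I(X;Y)$, which is exactly why the bound is useful for all rates below capacity.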
