ECE 5620 — Spring 2011
Prelim Exam: Mar. 10, 2011
Name: ______    NetID: ______

Rules:
- Do not open this exam until you are instructed to do so.
- You are permitted one letter-sized crib sheet. Otherwise the exam is closed-book and closed-note.
- No calculators are permitted.
- No collaboration is permitted.
- You must justify your answers to get full credit.
- Points may be deducted for incorrect statements.
- You have two hours to complete this exam.

Good Luck!

1. Han's Inequality

Suppose that X1, X2, and X3 are discrete random variables.

(a) (2 pts) Show that

H(X1|X2,X3) + H(X2|X1,X3) + H(X3|X1,X2) <= H(X1,X2,X3).
Solution: By the chain rule,

H(X1,X2,X3) = H(X1) + H(X2|X1) + H(X3|X1,X2).

Since conditioning reduces entropy, H(X1|X2,X3) <= H(X1) and H(X2|X1,X3) <= H(X2|X1). Therefore

H(X1,X2,X3) >= H(X1|X2,X3) + H(X2|X1,X3) + H(X3|X1,X2).
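As a numerical sanity check (not part of the required justification), both the inequality above and Han's inequality from part (b) can be verified on a randomly drawn joint PMF. The helper names below are illustrative, not from the exam:

```python
import itertools, math, random

def H(pmf):
    """Entropy in bits of a PMF given as a dict mapping outcomes to probabilities."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

def marginal(joint, idxs):
    """Marginal PMF of the coordinates in idxs from a joint PMF over tuples."""
    out = {}
    for x, p in joint.items():
        key = tuple(x[i] for i in idxs)
        out[key] = out.get(key, 0.0) + p
    return out

random.seed(0)
# Random joint PMF for (X1, X2, X3), each binary.
weights = {x: random.random() for x in itertools.product([0, 1], repeat=3)}
total = sum(weights.values())
joint = {x: w / total for x, w in weights.items()}

H123 = H(joint)
H12, H13, H23 = (H(marginal(joint, i)) for i in [(0, 1), (0, 2), (1, 2)])
# Conditional entropies via the chain rule, e.g. H(X1|X2,X3) = H(X1,X2,X3) - H(X2,X3).
cond_sum = (H123 - H23) + (H123 - H13) + (H123 - H12)

print(cond_sum <= H123 + 1e-12)             # part (a)
print(2 * H123 <= H12 + H13 + H23 + 1e-12)  # Han's inequality, part (b)
```

Both checks hold for any joint PMF, so re-running with a different seed gives the same verdict.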
(b) (4 pts) Show Han's inequality for three variables:

2 H(X1,X2,X3) <= H(X1,X2) + H(X1,X3) + H(X2,X3).

Hint: Apply the chain rule three different ways and then apply the result in (a).
Solution: Applying the chain rule three different ways,

H(X1,X2,X3) = H(X1,X2) + H(X3|X1,X2)
H(X1,X2,X3) = H(X1,X3) + H(X2|X1,X3)
H(X1,X2,X3) = H(X2,X3) + H(X1|X2,X3).

Adding all three and applying the result in (a),

3 H(X1,X2,X3) <= H(X1,X2) + H(X1,X3) + H(X2,X3) + H(X1,X2,X3),

which gives 2 H(X1,X2,X3) <= H(X1,X2) + H(X1,X3) + H(X2,X3).

(c) (3 pts) What are the conditions for equality in Han's inequality?

Solution: Equality holds iff equality holds in (a), i.e., iff the two "conditioning reduces entropy" steps are tight: H(X1|X2,X3) = H(X1) and H(X2|X1,X3) = H(X2|X1). The first says X1 is independent of (X2,X3); combined with it, the second says X2 and X3 are independent as well. Hence equality holds iff X1, X2, X3 are mutually independent, in which case both sides equal 2[H(X1) + H(X2) + H(X3)].

2. Vector Binary Channels

(a) (5 pts) Consider a discrete memoryless channel that takes two bits
(X1, X2) as input and produces two bits (Y1, Y2) as output. With probability p, both bits are flipped, and with probability 1 - p, neither of the two bits is flipped. More precisely, we have

Pr((Y1,Y2) = (1 - X1, 1 - X2)) = p
Pr((Y1,Y2) = (X1,X2)) = 1 - p.

Determine the capacity of this channel. What is an optimal input distribution?

Solution: The capacity is

C = max over p(x1,x2) of I(X1,X2; Y1,Y2) = max [H(Y1,Y2) - H(Y1,Y2|X1,X2)].

Given the input pair, the output is determined by the single binary "flip" event, so H(Y1,Y2|X1,X2) = H(p), the binary entropy function. Also H(Y1,Y2) <= 2, with equality when (Y1,Y2) is uniform, which the uniform input distribution achieves. Therefore

C = 2 - H(p),

and the uniform distribution on the four input pairs is optimal.

(b) (2 pts) Consider now a channel that is discrete but not memoryless.
Specifically, the channel is binary-input and binary-output, and for each i >= 1, the inputs at times 2i - 1 and 2i are both flipped with probability p and neither is flipped with probability 1 - p, independently for each i. Thus the channel is similar to a BSC except that the "noise" remains constant for two time periods. Compute the capacity of this channel. Hint: Use the result in (a).
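The closed-form answers for (a) and (b) can be confirmed numerically by computing the mutual information of the pair channel directly; the function names below are illustrative sketches, not part of the exam:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_info_pair_channel(p, px):
    """I((X1,X2);(Y1,Y2)) for the channel that flips both bits with probability p.
    px: dict mapping each input pair to its probability."""
    pairs = [(a, b) for a in (0, 1) for b in (0, 1)]
    # Output distribution: y = x with prob 1 - p, y = bitwise complement of x with prob p.
    py = {y: 0.0 for y in pairs}
    for x, q in px.items():
        flipped = (1 - x[0], 1 - x[1])
        py[x] += q * (1 - p)
        py[flipped] += q * p
    Hy = -sum(q * math.log2(q) for q in py.values() if q > 0)
    # Given the input, the output is determined by one coin flip: H(Y1,Y2|X1,X2) = h2(p).
    return Hy - h2(p)

p = 0.11
uniform = {(a, b): 0.25 for a in (0, 1) for b in (0, 1)}
I_unif = mutual_info_pair_channel(p, uniform)
print(abs(I_unif - (2 - h2(p))) < 1e-12)  # C = 2 - H(p) at the uniform input, part (a)
print(I_unif / 2)                          # per-channel-use rate 1 - H(p)/2, part (b)
```

Halving the pair rate matches the supersymbol argument used for part (b): two channel uses carry one use of the channel from (a).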
Solution: Group the channel uses into consecutive pairs. Each pair is a single use of the memoryless channel from (a), so n uses of this channel correspond to n/2 uses of the channel in (a). The capacity per channel use is therefore

Cb = (2 - H(p)) / 2 = 1 - H(p)/2.

(c) (7 pts) Now consider the channel in (b) with feedback; that is, the encoder causally observes the channel output. Consider an arbitrary feedback code. Let W be a randomly chosen message, and let Y^n denote the channel output. For simplicity we assume that n is even.
Show that I(W; Y^n) <= n Cb, where Cb is the capacity that you found in (b). Remark: By combining this result with Fano's inequality, one can show that feedback does not increase the capacity of the channel in (b), even though this channel has memory. You do not need to show this, however.

Solution: Group the outputs into pairs and apply the chain rule:

I(W; Y^n) = sum_{i=1}^{n/2} I(W; Y_{2i-1}, Y_{2i} | Y^{2i-2})
          = sum_{i=1}^{n/2} [ H(Y_{2i-1}, Y_{2i} | Y^{2i-2}) - H(Y_{2i-1}, Y_{2i} | W, Y^{2i-2}) ].

The first term in each summand is at most 2. For the second, let Zi denote the common noise bit for pair i. Given (W, Y^{2i-2}), the input X_{2i-1} is determined, so Y_{2i-1} determines Zi, which in turn determines X_{2i} (through the feedback) and hence Y_{2i}. Therefore

H(Y_{2i-1}, Y_{2i} | W, Y^{2i-2}) = H(Y_{2i-1} | W, Y^{2i-2}) = H(Zi) = H(p),

since Zi is independent of (W, Y^{2i-2}). Each summand is thus at most 2 - H(p) = 2 Cb, so

I(W; Y^n) <= (n/2) * 2 Cb = n Cb.
3. Worst-Case Noise

(a) (2 pts) Argue that the Shannon capacity of a discrete memoryless channel, when viewed as a function of the conditional PMF p(y|x), is convex.

Solution: C = max over p(x) of I(X;Y). For each fixed p(x), I(X;Y) is a convex function of p(y|x). A pointwise maximum of convex functions is convex, so C is convex in p(y|x).

(b) (4 pts) Consider the class of binary-input binary-output discrete memoryless channels of the form

[transition diagram not legible in the scan; from the solution, the channel flips input 0 with probability p and input 1 with probability q]

where p + q = 1/4. Subject to the constraint p + q = 1/4, what value
of p minimizes the capacity? What is this minimum capacity? Lg} C(f) Jwak Hz ("1:349 u a 'Puuehu.‘ at r .. .L ,
Olin“ 'Cuv) if coww ‘M r 0" ((f)  C(Y y frwmm.
é (Gt) 6 ELCU'M' '5‘C(:‘;p/ (1 r9
‘= {can + 31“,, (I ya
: C(r/ ’ ‘ N:l:l¢:2¢!
L.” r : «J ‘l’K “ M?) Ct r0 1w I!» +Lt "I“.lmo“. (I'ncJ‘k’ (c) (4 pts) Consider the same class of channels as in (b). Subject to the
constraint p + q = 1/4, what value(s) of p maximize the capacity? What is this maximum capacity?

Solution: Write p as the convex combination p = (1 - 4p) * 0 + 4p * (1/4). Since the channel law is affine in p and C is convex,

C(p) <= (1 - 4p) C(0) + 4p C(1/4).

But C(0) = C(1/4) by the relabeling symmetry, so C(p) <= C(0) = C(1/4) for all p. The capacity is therefore maximized at p = 0 and p = 1/4, i.e., by the Z-channels with crossover probability 1/4, and the maximum capacity is C(0) = C(1/4), the capacity of that Z-channel.

4. Typical Sets

For each of the following densities, compute and plot the typical set A_eps^(2).
That is, show the region in the 2D plane corresponding to A_eps^(n) with n = 2.

(a) (5 pts) f(x) = (1/sqrt(2 pi)) exp(-x^2/2), x in R.

Solution: Here h(f) = (1/2) ln(2 pi e) nats. By definition,

A_eps^(2) = { (x1,x2) : | -(1/2) ln[f(x1) f(x2)] - h(f) | <= eps }.

Since -(1/2) ln[f(x1) f(x2)] = (1/2) ln(2 pi) + (x1^2 + x2^2)/4, the condition reduces to

| (x1^2 + x2^2)/4 - 1/2 | <= eps, i.e., 2 - 4 eps <= x1^2 + x2^2 <= 2 + 4 eps.

The typical set is an annulus centered at the origin with inner radius sqrt(2 - 4 eps) and outer radius sqrt(2 + 4 eps).

(b) (3 pts) f(x) = mu exp(-mu x), x >= 0.

Solution: Here h(f) = 1 - ln(mu) nats, and -(1/2) ln[f(x1) f(x2)] = -ln(mu) + mu (x1 + x2)/2, so the condition | mu (x1 + x2)/2 - 1 | <= eps gives

A_eps^(2) = { (x1,x2) : x1, x2 >= 0, (2/mu)(1 - eps) <= x1 + x2 <= (2/mu)(1 + eps) },

a diagonal strip in the first quadrant between the lines x1 + x2 = (2/mu)(1 - eps) and x1 + x2 = (2/mu)(1 + eps).

(c) (1 pt) f(x) = 1/a, x in [0, a].
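The annulus description in (a) can be checked against the raw definition of typicality by sampling; this is an illustrative sketch in nats, with hypothetical helper names:

```python
import math, random

def gaussian_typical(x1, x2, eps):
    """Definitional check: |-(1/2) ln[f(x1) f(x2)] - h| <= eps for the standard normal."""
    logf = lambda x: -0.5 * math.log(2 * math.pi) - x * x / 2
    h = 0.5 * math.log(2 * math.pi * math.e)
    return abs(-(logf(x1) + logf(x2)) / 2 - h) <= eps

def annulus(x1, x2, eps):
    """Geometric description derived in (a): 2 - 4 eps <= x1^2 + x2^2 <= 2 + 4 eps."""
    r2 = x1 * x1 + x2 * x2
    return 2 - 4 * eps <= r2 <= 2 + 4 * eps

random.seed(1)
eps = 0.1
agree = all(
    gaussian_typical(x1, x2, eps) == annulus(x1, x2, eps)
    for x1, x2 in [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(10000)]
)
print(agree)
```

The two predicates are algebraically equivalent, so they agree on every sampled point.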
Solution: Here h(f) = ln(a), and -(1/2) ln[f(x1) f(x2)] = ln(a) exactly, for every (x1,x2) in [0,a]^2. The condition therefore holds for every eps >= 0, and

A_eps^(2) = [0, a]^2,

the entire square.

5. Shuffling Increases Entropy

Consider two "cards," modeled as a pair of discrete random variables (X0, X1).
Let S be a Bernoulli(1/2) random variable that is independent of the pair (X0, X1). Let T = 1 - S, so that T is also Bernoulli(1/2). Consider the "shuffled" cards (X_S, X_T).

(a) (4 pts) Show that shuffling does not decrease entropy by showing that

H(X_S, X_T) >= H(X0, X1).

Solution: Since conditioning reduces entropy,

H(X_S, X_T) >= H(X_S, X_T | S)
            = (1/2) H(X_S, X_T | S = 0) + (1/2) H(X_S, X_T | S = 1)
            = (1/2) H(X0, X1 | S = 0) + (1/2) H(X1, X0 | S = 1)
            = (1/2) H(X0, X1) + (1/2) H(X1, X0)
            = H(X0, X1),

where the second-to-last step uses the independence of S and (X0, X1), and the last step uses H(X0, X1) = H(X1, X0).

(b) (5 pts) A pair of random variables (X0, X1) is called exchangeable if (X0, X1) and (X1, X0) have the same distribution. Does shuffling strictly increase the entropy if (X0, X1) is exchangeable? Prove or disprove.

Solution: No. If (X0, X1) is exchangeable, then (X_S, X_T) has the same distribution as (X0, X1): conditioned on S = 0 the shuffled pair is (X0, X1), and conditioned on S = 1 it is (X1, X0), which has the same distribution by exchangeability. Hence H(X_S, X_T) = H(X0, X1), and shuffling does not strictly increase the entropy.

(c) (4 pts) Suppose S is allowed to depend on (X0, X1). Is it possible then for shuffling to decrease entropy? Explain.

Solution: Yes. Let X0 and X1 be i.i.d. Bernoulli(1/2), so that H(X0, X1) = 2 bits. Choose S depending on the cards so that the shuffled pair is sorted: S = 0 if X0 <= X1 and S = 1 otherwise. Then (X_S, X_T) takes the values (0,0), (0,1), (1,1) with probabilities 1/4, 1/2, 1/4, so

H(X_S, X_T) = 3/2 bits < 2 bits = H(X0, X1).
Instructor: Wagner