Probability and Statistics Final Exam
2009

1. (16%) X1 and X2 are independent random variables, each following a uniform distribution U(1, 1). Let Xmin = min(X1, X2) and Xmax = max(X1, X2).
(a) Are Xmin and Xmax independent? (Please show your answer based on the definition of independence, not your intuition.) (2%)
(b) Please find the joint CDF of Xmin and Xmax: F_{Xmin,Xmax}(x, y). (8%)
(c) Please find E(Xmax − Xmin). (5%)
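The endpoints of the uniform distribution are garbled in this copy, so the sketch below assumes U(0, 1) for concreteness; under that assumption E(Xmax − Xmin) = E(Xmax) − E(Xmin) = 2/3 − 1/3 = 1/3. The sample size and seed are arbitrary choices, not part of the exam.

```python
import random

def simulate_range(n_trials=200_000, seed=42):
    """Monte Carlo estimate of E(Xmax - Xmin) for two i.i.d. U(0, 1) draws."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        x1, x2 = rng.random(), rng.random()
        # max - min is the range of the two draws
        total += max(x1, x2) - min(x1, x2)
    return total / n_trials

est = simulate_range()
print(est)  # close to the analytic value 1/3 under the U(0, 1) assumption
```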
2. (12%) K students participate in a party game where each student picks a number from 1 to N uniformly and independently. The rule is that whoever picks the same number as someone else is a winner. For example, if David and Mary both pick 7, they are both winners. Let X be the number of winners and Y be the number of distinct numbers picked by the winners.
(a) Are X and Y independent? (Please show your answer based on the definition of independence, not your intuition.) (2%)
(b) Please find E(X) and E(Y). (10%)

3. (13%) X and Y have a joint PDF as follows: fX,Y(x, y) = [illegible in the scanned original].
(a) Are X and Y independent? (Please show your answer based on the definition of independence, not your intuition.) (2%)
(b) Let V = aX + bY and W = cX + dY. Please find a, b, c, and d such that V and W are independent random variables. (6%)
(c) Z = XY. Derive fZ. (5%)
(Hint: For a real-valued symmetric matrix A, we can find that A = UDU′, where D is a diagonal matrix with its elements being the eigenvalues of A, and U is a unitary matrix whose columns are n orthonormal eigenvectors
of A.)

4. (21%) David lives near the area where 3 MTA lines merge, as shown in the figure below. Based on his experience, he finds out:
i. The trains arrive at the blue-line, red-line, and green-line stations following Poisson distributions, with rates of 1/4, 1/2, and 1/4 trains per minute, respectively.
ii. When a train arrives at any one of the above three stations, the probability that the train has an empty seat is 0.5.
iii. Whenever a train arrives at any of the three stations, there will be K people (excluding David) waiting for the train; K is uniformly distributed between 0 and 4. However, more people are waiting in the main station. Therefore, K (again, excluding David) is uniformly distributed between 0 and 16 for the main station.
iv. Everyone waiting for the train has an equal probability of getting the empty seat.
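The assumptions above describe three independent Poisson arrival streams with rates 1/4, 1/2, and 1/4 per minute (values from the problem). As a quick empirical sketch of the superposition property, the simulation below merges the three streams and checks that the combined rate is about 1/4 + 1/2 + 1/4 = 1 arrival per minute, with merged interarrival times averaging about 1 minute. The horizon and seed are arbitrary choices.

```python
import random

HORIZON = 50_000.0  # simulated minutes (arbitrary)

def merged_arrivals(horizon=HORIZON, rates=(0.25, 0.5, 0.25), seed=7):
    """Simulate three independent Poisson streams and merge their arrival times."""
    rng = random.Random(seed)
    times = []
    for rate in rates:
        # Poisson process: successive exponential(rate) interarrival gaps
        t = rng.expovariate(rate)
        while t < horizon:
            times.append(t)
            t += rng.expovariate(rate)
    times.sort()
    return times

times = merged_arrivals()
rate_est = len(times) / HORIZON                   # expect about 1 per minute
gaps = [b - a for a, b in zip(times, times[1:])]
mean_gap = sum(gaps) / len(gaps)                  # expect exponential(1) mean, about 1
print(rate_est, mean_gap)
```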
[Figure: the blue line, red line, and green line merge at the main station, from which David travels to NTU.]

David can take the train at any train station to go to NTU. However, he does not like to wait, nor to stand in a train. Let's find a station for him so that E(waiting time) × Prob(standing in a train) is minimized.
(a) When David waits in the main station, is the probability that there is an empty seat still 0.5? If not, what is the probability that there is a seat left? (2%)
(b) When waiting in the main station, what is the probability that David will get an empty seat? (2%) (Note that we assume that no one gets off the train in the main station.)
(c) Show that the total number of train arrivals in one period of time is still Poisson and the total rate is 1 per minute. (7%) (Hint: use the moment generating function.)
(d) When waiting in the main station, what is the distribution of the interarrival time (i.e., the time between two consecutive train arrivals)? (6%) (Hint: the time interval between two consecutive events of a Poisson process follows an exponential distribution.)
(e) Given your above answers, which station should David choose to achieve his objective? (4%)

5. (14%) A semiconductor wafer has M VLSI chips on it and these chips have the same circuitry. Each VLSI chip consists of N interconnected transistors. A transistor may fail (not function properly) with a probability p because of its
fabrication process; failures are assumed to be independent across transistors. A chip is considered a failure if there are n or more transistor failures. Let K be the number of failed transistors on a VLSI chip, which is therefore a random variable (R.V.).
(a) What is a random variable? (4%)
(b) What is the sample space (also called the outcome set) over which the R.V. K is defined? (2%)
(c) Let Xi = 1 if chip i fails and Xi = 0 if chip i is good. Derive the probability that a chip is good, i.e., pg ≜ Pr{Xi = 0} = ? (3%)
(d) Whether one chip is good or fails is independent of the other chips. Let the yield of a wafer be defined as the percentage of good chips in the wafer, i.e., Y = (1 − (1/M) Σ_{i=1}^{M} Xi) × 100%. Then derive γ ≜ E[Y] = ? (7%) and Var(Y) = ? (5%) (Hint: utilize pg obtained from 5(c).)

6. (24%) You are observing a radar signal sequence Yk = Θ + ωk, k = 1, 2, 3, …, where Θ is an unknown constant, and ωk is N(μ, σ²), independent and identical over the time index k.
(a) You have one observation Y1 = x, where x is an observed value, and you set an estimate of Θ as Θ̂ = x − ω1. Derive f_{Θ̂|Y1}(θ | x), E[Θ̂ | x], and Var[Θ̂ | x]. (8%)
(b) Given N observations of Yk, how do you estimate the value of Θ? (3%) Is your estimate biased or unbiased? Why? (5%)
(c) When N ≥ 100, propose an approximation method for your confidence (in terms of probability) that your estimate in 6(b) is within 0.26 of the true parameter Θ. Please explain quantitatively why. (8%)
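For part 6(b), assuming the noise mean μ is known, a natural estimator is Θ̂ = (1/N) Σ Yk − μ, which is unbiased because E[Θ̂] = Θ + μ − μ = Θ. The sketch below (Θ, μ, σ, N, and the seed are illustrative values, not from the exam) checks unbiasedness empirically and shows the σ/√N spread that a CLT-based confidence approximation for part 6(c) would rely on.

```python
import math
import random

def estimate_theta(theta=3.0, mu=0.5, sigma=2.0, n_obs=100, n_runs=4000, seed=1):
    """Repeatedly compute theta_hat = mean(Y_k) - mu from fresh samples."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_runs):
        ys = [theta + rng.gauss(mu, sigma) for _ in range(n_obs)]
        estimates.append(sum(ys) / n_obs - mu)
    return estimates

ests = estimate_theta()
mean_est = sum(ests) / len(ests)   # close to theta = 3.0, i.e., unbiased
spread = math.sqrt(sum((e - mean_est) ** 2 for e in ests) / len(ests))
# CLT: spread of the estimator is about sigma / sqrt(n_obs) = 2 / 10 = 0.2
print(mean_est, spread)
```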
Random Variable       PMF or PDF                                                MGF φX(s)

Bernoulli (p)         PX(x) = 1 − p (x = 0); p (x = 1); 0 otherwise             1 − p + pe^s
Binomial (n, p)       PX(x) = C(n, x) p^x (1 − p)^(n−x)                         (1 − p + pe^s)^n
Geometric (p)         PX(x) = p(1 − p)^(x−1), x = 1, 2, …; 0 otherwise          pe^s / (1 − (1 − p)e^s)
Pascal (k, p)         PX(x) = C(x−1, k−1) p^k (1 − p)^(x−k)                     (pe^s / (1 − (1 − p)e^s))^k
Poisson (α)           PX(x) = α^x e^(−α) / x!, x = 0, 1, 2, …; 0 otherwise      e^(α(e^s − 1))
Disc. Uniform (k, l)  PX(x) = 1/(l − k + 1), x = k, k+1, …, l; 0 otherwise      (e^(sk) − e^(s(l+1))) / ((l − k + 1)(1 − e^s))
Constant (a)          fX(x) = δ(x − a)                                          e^(sa)
Uniform (a, b)        fX(x) = 1/(b − a), a < x < b; 0 otherwise                 (e^(bs) − e^(as)) / (s(b − a))
Exponential (λ)       fX(x) = λe^(−λx), x ≥ 0; 0 otherwise                      λ / (λ − s)
Erlang (n, λ)         fX(x) = λ^n x^(n−1) e^(−λx) / (n−1)!, x > 0; 0 otherwise  (λ / (λ − s))^n
Gaussian (μ, σ)       fX(x) = (1/(σ√(2π))) e^(−(x−μ)²/(2σ²))                    e^(sμ + s²σ²/2)

Table 6.1 Moment generating functions for families of random variables.

Definition 4.17 Bivariate Gaussian Random Variables
Random variables X and Y have a bivariate Gaussian PDF with parameters μ1, σ1, μ2, σ2, and ρ if

fX,Y(x, y) = (1 / (2πσ1σ2√(1 − ρ²))) exp{ −[ (x−μ1)²/σ1² − 2ρ(x−μ1)(y−μ2)/(σ1σ2) + (y−μ2)²/σ2² ] / (2(1 − ρ²)) },

where μ1 and μ2 can be any real numbers, σ1 > 0, σ2 > 0, and −1 < ρ < 1.

If X and Y are the bivariate Gaussian random variables in Definition 4.17, X is the Gaussian (μ1, σ1) random variable and Y is the Gaussian (μ2, σ2) random variable:

fX(x) = (1/(σ1√(2π))) e^(−(x−μ1)²/(2σ1²)),    fY(y) = (1/(σ2√(2π))) e^(−(y−μ2)²/(2σ2²)).

Theorem 4.29 If X and Y are bivariate Gaussian random variables, the conditional PDF of Y given X is

fY|X(y|x) = (1/(σ̃2√(2π))) e^(−(y−μ̃2(x))²/(2σ̃2²)),

where, given X = x, the conditional expected value and variance of Y are

μ̃2(x) = μ2 + ρ(σ2/σ1)(x − μ1),    σ̃2² = σ2²(1 − ρ²).
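Theorem 4.29's conditional-mean and conditional-variance formulas can be sanity-checked by sampling: draw X as Gaussian (μ1, σ1), then draw Y from the conditional PDF above, and verify that the resulting pairs have correlation ρ and that Y has standard deviation σ2. All parameter values below are illustrative, not from the text.

```python
import math
import random

def sample_bivariate(mu1=1.0, s1=2.0, mu2=-1.0, s2=3.0, rho=0.6, n=100_000, seed=3):
    """Sample (X, Y) via the conditional decomposition of Theorem 4.29."""
    rng = random.Random(seed)
    cond_sd = s2 * math.sqrt(1 - rho ** 2)  # conditional std dev of Y given X = x
    pairs = []
    for _ in range(n):
        x = rng.gauss(mu1, s1)
        # conditional mean: mu2 + rho*(s2/s1)*(x - mu1)
        y = rng.gauss(mu2 + rho * (s2 / s1) * (x - mu1), cond_sd)
        pairs.append((x, y))
    return pairs

pairs = sample_bivariate()
n = len(pairs)
mx = sum(x for x, _ in pairs) / n
my = sum(y for _, y in pairs) / n
sx = math.sqrt(sum((x - mx) ** 2 for x, _ in pairs) / n)
sy = math.sqrt(sum((y - my) ** 2 for _, y in pairs) / n)
corr = sum((x - mx) * (y - my) for x, y in pairs) / (n * sx * sy)
print(mx, sy, corr)  # near mu1 = 1.0, sigma2 = 3.0, rho = 0.6
```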
Spring '07
張捷力