[Figure: two plots of e(z) (left) and |e(z)| (right) versus z, for 0 ≤ z ≤ 10; the vertical scale is on the order of 10⁻³]
The left side plot graphs e(z) versus z. It appears that e(z) = 0 for z ≥ 3. In fact, e(z) is nonzero over that range, but the relative error is so small that it isn't visible in comparison
F_{X,W}(x,w) = P[X ≤ x, W ≤ w] = P[X ≤ x, X < Y ≤ X + w]

F_{X,W}(x,w) = ∫₀ˣ ∫_u^{u+w} λ² e^{−λy} dy du   (3)

            = ∫₀ˣ [ −λ e^{−λy} ]_{y=u}^{y=u+w} du   (4)

            = ∫₀ˣ ( λ e^{−λu} − λ e^{−λ(u+w)} ) du   (5)

            = [ −e^{−λu} + e^{−λ(u+w)} ]_{u=0}^{u=x}   (6)

            = (1 − e^{−λx})(1 − e^{−λw})
We see that F_{X,W}(x,w) = F_X(x) F_W(w). Moreover, by applying Theorem 4.4,

f_{X,W}(x,w) = ∂²F_{X,W}(x,w) / (∂x ∂w) = (λe^{−λx})(λe^{−λw})   (7)
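As a numerical sketch of the factorization above, we can sample (X, Y) from the joint PDF f(x,y) = λ²e^{−λy}, 0 ≤ x ≤ y (marginally Y is Erlang(2, λ) and, given Y = y, X is uniform on (0, y)), and compare the empirical value of P[X ≤ x₀, Y − X ≤ w₀] with the product form (1 − e^{−λx₀})(1 − e^{−λw₀}). The test point (x₀, w₀) is arbitrary.

```python
import math
import random

# Monte Carlo cross-check (a sketch) of F_{X,W}(x,w) = (1-e^{-lam x})(1-e^{-lam w}).
# Sampling: Y ~ Erlang(2, lam) and X | Y = y ~ Uniform(0, y) reproduces the
# joint PDF lam^2 e^{-lam y} on 0 <= x <= y.
random.seed(1)
lam = 1.0
n = 200_000
x0, w0 = 0.7, 1.3  # an arbitrary test point
count = 0
for _ in range(n):
    y = random.expovariate(lam) + random.expovariate(lam)  # Erlang(2, lam)
    x = random.uniform(0.0, y)                             # X | Y = y uniform
    if x <= x0 and y - x <= w0:                            # event {X <= x0, W <= w0}
        count += 1
empirical = count / n
closed_form = (1 - math.exp(-lam * x0)) * (1 - math.exp(-lam * w0))
print(empirical, closed_form)
```

With 200,000 samples, the empirical probability should agree with the closed form to within Monte Carlo error of roughly ±0.003.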
Problem 4.11.1 Solution
f_{X,Y}(x,y) = c e^{−(x²/8 + y²/18)}   (1)

The omission of any limits for the PDF indicates that it is defined over all x and y. We know that f_{X,Y}(x,y) is in the form of the bivariate Gaussian distribution, so we look to Definition 4.17
The event A has probability

P[A] = ∬_{x>y} f_{X,Y}(x,y) dy dx   (5)

     = ∫₀¹ ∫₀ˣ 6xy² dy dx   (6)

     = ∫₀¹ 2x⁴ dx = 2/5   (7)

[Figure: the shaded triangular region X > Y inside the unit square]

The conditional joint PDF of X and Y given A is

f_{X,Y|A}(x,y) = { f_{X,Y}(x,y)/P[A]   (x,y) ∈ A
                 { 0                   otherwise   (8)

               = { 15xy²   0 ≤ y ≤ x ≤ 1
                 { 0       otherwise
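As a quick numerical sketch, a midpoint-rule double sum over the triangle 0 ≤ y ≤ x ≤ 1 can confirm both that P[A] = 2/5 under f(x,y) = 6xy² and that the conditional PDF 15xy² integrates to 1:

```python
# Midpoint-rule check (a sketch): P[A] = 2/5 under f(x,y) = 6 x y^2 on
# 0 <= y <= x <= 1, and the conditional PDF 15 x y^2 integrates to 1 there.
n = 400
h = 1.0 / n
prob_a = 0.0
total = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if y <= x:  # the conditioning region A = {X > Y}
            prob_a += 6 * x * y ** 2 * h * h
            total += 15 * x * y ** 2 * h * h
print(prob_a, total)
```

The staircase approximation of the diagonal boundary limits accuracy to a few parts in a thousand, which is enough to confirm the values 2/5 and 1.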
By Denition 4.3,
F_{X,Y}(x,y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_{X,Y}(u,v) dv du   (3)

             = ( ∫_{−∞}^{x} f_X(u) du ) ( ∫_{−∞}^{y} f_Y(v) dv )   (4)

             = F_X(x) F_Y(y)   (5)
Problem 4.10.15 Solution
Random variables X and Y have joint PDF
f_{X,Y}(x,y) = { λ²e^{−λy}   0 ≤ x ≤ y
               { 0           otherwise   (1)

For W = Y − X we can find f_W(w) by integrating
[Tree diagram: branches from X = 0 and X = 1 to Y = 0 and Y = 1, labeled with P_X(0), P_X(1) and the conditional probabilities P_{Y|X}(y|x); the labeled values include 3/4, 1/4, 1/2, and 1/3]
(b) Since X 1 and X 2 are independent, we can say that
P[X₁ ≤ 1, X₂ ≤ 1] = P[X₁ ≤ 1] P[X₂ ≤ 1] = F_{X₁}(1) F_{X₂}(1) = [F_X(1)]² = 1/16   (3)

(c) For W = max(X₁, X₂),

F_W(1) = P[max(X₁, X₂) ≤ 1] = P[X₁ ≤ 1, X₂ ≤ 1]   (4)
Since X 1 and X 2 are independent,
Problem 4.10.9 Solution
Since X and Y take on only integer values, W = X + Y is integer valued as well. Thus for an integer w,

PW(w) = P[W = w] = P[X + Y = w].   (1)

Suppose X = k; then W = w if and only if Y = w − k. To find all ways that X + Y = w, we
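Summing over all such k gives the discrete convolution P_W(w) = Σ_k P_X(k) P_Y(w − k) when X and Y are independent. A small sketch, using hypothetical PMFs purely for illustration:

```python
# Sketch: for independent integer-valued X and Y, the PMF of W = X + Y is
# the convolution P_W(w) = sum_k P_X(k) P_Y(w - k).
# The PMFs below are hypothetical, chosen only to illustrate the computation.
PX = {0: 0.25, 1: 0.5, 2: 0.25}
PY = {0: 0.5, 1: 0.5}

def pmf_of_sum(PX, PY):
    PW = {}
    for k, px in PX.items():        # condition on X = k ...
        for j, py in PY.items():    # ... then W = k + j requires Y = j
            PW[k + j] = PW.get(k + j, 0.0) + px * py
    return PW

PW = pmf_of_sum(PX, PY)
print(PW)
```

The resulting PMF sums to 1, as any valid PMF must.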
(b) Before we find E[B], it will prove helpful to find the marginal PMFs P_B(b) and P_M(m). These can be found from the row and column sums of the table of the joint PMF:

P_{B,M}(b,m)   m = 60   m = 180   P_B(b)
b = 1            0.3      0.2      0.5
b = 2            0.1      0.2      0.3
b = 3            0.1      0.1      0.2
P_M(m)           0.5      0.5
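The row/column-sum recipe is mechanical, so it is easy to check numerically. A sketch, with the cell values as read from the table above (the cell assignment is my reading of the garbled table, so treat the specific numbers as illustrative):

```python
# Marginals as row/column sums of a joint PMF, plus E[B].
# Cell values follow the P_{B,M} table as reconstructed above (illustrative).
PBM = {(1, 60): 0.3, (1, 180): 0.2,
       (2, 60): 0.1, (2, 180): 0.2,
       (3, 60): 0.1, (3, 180): 0.1}
PB = {b: sum(p for (bb, m), p in PBM.items() if bb == b) for b in (1, 2, 3)}
PM = {m: sum(p for (b, mm), p in PBM.items() if mm == m) for m in (60, 180)}
EB = sum(b * p for b, p in PB.items())
print(PB, PM, EB)
```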
The complete expression for the joint CDF is
F_{W,Y}(w,y) = { 1 − e^{−λw} − λw e^{−λy}   0 ≤ w ≤ y
               { 1 − (1 + λy)e^{−λy}        0 ≤ y ≤ w
               { 0                          otherwise   (19)

Applying Theorem 4.4 yields

f_{W,Y}(w,y) = ∂²F_{W,Y}(w,y)/(∂w ∂y) = { λ²e^{−λy}   0 ≤ w ≤ y
                                        { 0           otherwise   (20)

The joint PDF f_{W,Y}(w,y) doesn't factor and thus W and Y are dependent.
(c) Since ρ = 1/√2, now we can solve for σ_X and σ_Y:

σ_X = 1/√2,   σ_Y = 1/2.   (6)

(d) From here we can solve for c:

c = 1 / (2π σ_X σ_Y √(1 − ρ²)) = 2/π.   (7)
(e) X and Y are dependent because ρ ≠ 0.

Problem 4.11.3 Solution
From the problem statement, we learn that

σ_X = σ_Y = 1,   ρ_{X,Y} = 1/2.   (1)
function covxy=finitecov(SX,SY,PXY);
%Usage: cxy=finitecov(SX,SY,PXY)
%returns the covariance of
%finite random variables X and Y
%given by grids SX, SY, and PXY
ex=finiteexp(SX,PXY);
ey=finiteexp(SY,PXY);
R=finiteexp(SX.*SY,PXY);
covxy=R-ex*ey;
The follow
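The MATLAB routine above computes Cov[X,Y] = E[XY] − E[X]E[Y] over a finite grid. A Python sketch of the same idea (the function names here are my own, and grids are flattened to parallel lists of values and probabilities):

```python
# Python analogue of the finitecov/finiteexp routines above (a sketch):
# grids are represented as parallel lists of point values and probabilities.
def finite_exp(values, probs):
    """Expected value of a finite random variable."""
    return sum(v * p for v, p in zip(values, probs))

def finite_cov(sx, sy, pxy):
    """Cov[X,Y] = E[XY] - E[X]E[Y] over the grid points (sx[i], sy[i])."""
    ex = finite_exp(sx, pxy)
    ey = finite_exp(sy, pxy)
    rxy = finite_exp([x * y for x, y in zip(sx, sy)], pxy)
    return rxy - ex * ey

# Hypothetical example: X = Y on two equally likely points, so Cov[X,Y] = Var[X]
sx = [0, 1]
sy = [0, 1]
pxy = [0.5, 0.5]
print(finite_cov(sx, sy, pxy))
```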
(b) In this case, the joint PDF of X and Y is inversely proportional to the area of the target.
f_{X,Y}(x,y) = { 1/[π 50²]   x² + y² ≤ 50²
               { 0           otherwise   (4)

The probability of a bullseye is

P[B] = P[X² + Y² ≤ 2²] = (π 2²)/(π 50²) = (1/25)² = 0.0016.   (5)
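The area-ratio argument is easy to sanity-check by simulation: sample hit points uniformly over a square, keep those landing on the radius-50 target, and count the fraction within the radius-2 bullseye. A sketch:

```python
import random

# Monte Carlo sketch: for a hit uniform on the radius-50 disk, the fraction
# landing within radius 2 should approach (2/50)^2 = 0.0016.
random.seed(3)
n = 1_000_000
on_target = 0
bullseye = 0
for _ in range(n):
    x = random.uniform(-50.0, 50.0)
    y = random.uniform(-50.0, 50.0)
    r2 = x * x + y * y
    if r2 <= 2500.0:       # hit lands on the target
        on_target += 1
        if r2 <= 4.0:      # hit lands within the radius-2 bullseye
            bullseye += 1
p_bull = bullseye / on_target
print(p_bull)
```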
(c) In this ins
The marked integral equals 1 because for each value of x, it is the integral of a Gaussian PDF of one variable over all possible values. In fact, it is the integral of the conditional PDF f_{Y|X}(y|x) over all possible y. To complete the proof, we see that
[Tree diagram: T > 38 with probability p and T ≤ 38 with probability 1 − p, followed by W > 10 with probability q and W ≤ 10 with probability 1 − q]
The probability the person is ill is
P [I ] = P [T > 38, W > 10] = P [T > 38] P [W > 10] = pq = 0.0107.
(b) The general form of the bivariate Gaussian PDF is
f_{W,T}(w,t) = 1/(2π σ₁ σ₂ √(1 − ρ²)) exp{ −[ (w − μ₁)²/σ₁² − 2ρ(w − μ₁)(t − μ₂)/(σ₁σ₂) + (t − μ₂)²/σ₂² ] / (2(1 − ρ²)) }
Given T = t, the conditional probability the person is declared ill is
P[I|T = t] = P[W > 10|T = t]   (11)

= P[ (W − (7 + 2(t − 37)))/√2 > (10 − (7 + 2(t − 37)))/√2 ]   (12)

= P[ Z > (3 − 2(t − 37))/√2 ] = Q( (3 − 2(t − 37))/√2 )   (13)
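The Q function can be evaluated with the complementary error function, via Q(z) = (1/2) erfc(z/√2). A sketch that evaluates the conditional probability above as a function of t (note Q(0) = 1/2, so a person with t = 38.5 is declared ill with probability exactly 1/2, and the probability increases with temperature):

```python
import math

# Q(z) = P[Z > z] for standard Gaussian Z, via the complementary error function.
def Q(z):
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# P[I | T = t] = Q((3 - 2(t - 37)) / sqrt(2)), per the expression above.
def p_ill_given_t(t):
    return Q((3.0 - 2.0 * (t - 37.0)) / math.sqrt(2.0))

print(p_ill_given_t(37.0), p_ill_given_t(38.5))
```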
Problem 4.11.6 Solution
The given joint PDF is
f X,Y (x, y) =
(c) To find the conditional PMF P_{D|N}(d|2), we first need to find the probability of the conditioning event

P_N(2) = P_{N,D}(2, 20) + P_{N,D}(2, 100) + P_{N,D}(2, 300) = 0.4   (3)

The conditional PMF of D given N = 2 is

P_{D|N}(d|2) = P_{N,D}(2,d)/P_N(2) = { 1/4   d = 20
                                     { 1/2   d = 100
                                     { 1/4   d = 300
                                     { 0     otherwise
We can calculate the requested moments.
E[X] = (3/4)(0) + (1/4)(20) = 5   (2)

Var[X] = (3/4)(0 − 5)² + (1/4)(20 − 5)² = 75   (3)

E[X + Y] = E[X] + E[Y] = 2E[X] = 10   (4)
Since X and Y are independent, Theorem 4.27 yields
Var[X + Y] = Var[X] + Var[Y] = 2 Var[X] = 150.
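The moment calculations above follow directly from the two-point PMF P_X(0) = 3/4, P_X(20) = 1/4, and are easy to verify mechanically:

```python
# Check of E[X], Var[X], and the sum formulas for the two-point PMF
# P_X(0) = 3/4, P_X(20) = 1/4 used above (Y is an independent copy of X).
pmf = {0: 0.75, 20: 0.25}
EX = sum(x * p for x, p in pmf.items())
VarX = sum((x - EX) ** 2 * p for x, p in pmf.items())
E_sum = 2 * EX        # E[X + Y] = E[X] + E[Y] for identically distributed Y
Var_sum = 2 * VarX    # Var[X + Y] = Var[X] + Var[Y] when X, Y independent
print(EX, VarX, E_sum, Var_sum)
```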
Problem 4.8.5 Solution
The joint PDF of X and Y is
f_{X,Y}(x,y) = { (x + y)/3   0 ≤ x ≤ 1, 0 ≤ y ≤ 2
               { 0           otherwise   (1)
(a) The probability that Y 1 is
P[A] = P[Y ≤ 1] = ∬_{y≤1} f_{X,Y}(x,y) dx dy   (2)

     = ∫₀¹ ∫₀¹ (x + y)/3 dy dx

     = ∫₀¹ [ xy/3 + y²/6 ]_{y=0}^{y=1} dx   (3)

     = ∫₀¹ (x/3 + 1/6) dx = 1/3.

[Figure: the region Y ≤ 1 within the rectangle 0 ≤ x ≤ 1, 0 ≤ y ≤ 2]
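Since the integrand (x + y)/3 is linear, a midpoint-rule double sum over 0 ≤ x ≤ 1, 0 ≤ y ≤ 1 reproduces P[Y ≤ 1] essentially exactly, which gives a quick sanity check of the value 1/3:

```python
# Midpoint-rule check that P[Y <= 1] = 1/3 for f(x,y) = (x+y)/3 on
# 0 <= x <= 1, 0 <= y <= 2 (integrating y only up to 1).
n = 500
h = 1.0 / n
prob = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        prob += (x + y) / 3.0 * h * h
print(prob)
```

The midpoint rule is exact for linear integrands, so the sum matches 1/3 up to floating-point rounding.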
Note that further along in the problem we will need E[N 2 |B] which we now calculate.
E[N²|B] = Var[N|B] + (E[N|B])²   (9)

        = 2/p² + 17/p + 81   (10)
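This moment can be checked numerically. Assuming the conditional PMF of N given B is geometric shifted to start at 10, P(N = n | B) = (1 − p)^{n−10} p for n ≥ 10 (so that E[N|B] = 9 + 1/p and Var[N|B] = (1 − p)/p²), a truncated sum reproduces the closed form:

```python
# Numeric sketch: assuming P(N = n | B) = (1-p)^(n-10) p for n = 10, 11, ...,
# check that E[N^2 | B] = 2/p^2 + 17/p + 81.
p = 0.5
closed = 2.0 / p ** 2 + 17.0 / p + 81.0
numeric = sum(n * n * (1.0 - p) ** (n - 10) * p for n in range(10, 400))
print(closed, numeric)
```

For p = 1/2 both expressions equal 123, since Var[N|B] = 2 and E[N|B] = 11.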
For the conditional moments of K , we work directly with the conditional PMF PN ,K |B (n, k).
(1 − p)^{n−10} p
Problem 4.9.1 Solution
The main part of this problem is just interpreting the problem statement. No calculations are necessary. Since a trip is equally likely to last 2, 3, or 4 days,

P_D(d) = { 1/3   d = 2, 3, 4
         { 0     otherwise   (1)
Given a trip lasts d days, the
Problem 4.8.3 Solution
Given the event A = {X + Y ≤ 1}, we wish to find f_{X,Y|A}(x,y). First we find

P[A] = ∫₀¹ ∫₀^{1−x} 6e^{−(2x+3y)} dy dx = 1 − 3e^{−2} + 2e^{−3}.   (1)

So then

f_{X,Y|A}(x,y) = { 6e^{−(2x+3y)} / (1 − 3e^{−2} + 2e^{−3})   x + y ≤ 1, x ≥ 0, y ≥ 0
                 { 0                                         otherwise   (2)
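The value of P[A] can be verified numerically with a midpoint-rule sum over the triangle x + y ≤ 1 in the unit square:

```python
import math

# Midpoint-rule check that P[X + Y <= 1] = 1 - 3e^{-2} + 2e^{-3} for
# f(x,y) = 6 e^{-(2x+3y)}, x >= 0, y >= 0.
n = 400
h = 1.0 / n
prob = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        if x + y <= 1.0:
            prob += 6.0 * math.exp(-(2.0 * x + 3.0 * y)) * h * h
closed = 1.0 - 3.0 * math.exp(-2.0) + 2.0 * math.exp(-3.0)
print(prob, closed)
```

Both numbers come out near 0.694; the small residual is the staircase error along the line x + y = 1.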
Problem 4.8.4 Solution
(a) The probability of event A = {Y ≤ 1/2} is

P[A] = ∬_{y≤1/2} f_{X,Y}(x,y) dy dx = ∫₀¹ ∫₀^{1/2} (4x + 2y)/3 dy dx.   (2)

With some calculus,

P[A] = ∫₀¹ [ (4xy + y²)/3 ]_{y=0}^{y=1/2} dx = ∫₀¹ (2x + 1/4)/3 dx = [ x²/3 + x/12 ]₀¹ = 5/12.   (3)
(b) The conditional joint PDF o
(0, 1) random variable U, P[U = 1/4] = 0. Thus we can choose any w ∈ [−3, 3]. In particular, we define the inverse CDF as

w = F_W^{−1}(u) = { 8u − 5       0 ≤ u ≤ 1/4
                  { (8u + 7)/3   1/4 < u ≤ 1   (1)

Note that because 0 ≤ F_W(w) ≤ 1, the inverse F_W^{−1}(u) is defined only for 0 ≤ u ≤ 1. Careful
(b) Similar results hold for Gaussian random variables. The following code generates the same
comparison between the Gaussian PDF and the relative frequency of n samples.
function gausstest(mu,sigma2,n)
delta=0.01;
x=gaussrv(mu,sigma2,n);
xr=(0:delta:(mu+
(b) Since PA (1) = PA,B (1, 0) + PA,B (1, 1) = 2/3,
P_{B|A}(b|1) = P_{A,B}(1,b)/P_A(1) = { 1/2   b = 0, 1
                                     { 0     otherwise.   (3)

If A = 1, the conditional expectation of B is

E[B|A = 1] = Σ_{b=0}^{1} b P_{B|A}(b|1) = P_{B|A}(1|1) = 1/2.   (4)
(c) Before finding the conditional PM