Austin Wen
STAT4260 HW9
7.1 Suppose that a home mortgage company has N mortgages numbered serially in the order that they were granted over a period of 20 years. There is a generally increasing trend in the unpaid balances because of the rising cost of housing.
Note that finiteexp performs its calculations on the sample values sx and probabilities px using the column vectors sx(:) and px(:). As a result, we can use the same finiteexp function when the random variable is represented by grid variables. We can build further functions, such as finitecov, on top of finiteexp.
Problem 4.10.9 Solution
Since X and Y take on only integer values, W = X + Y is integer valued as well. Thus for an integer w,
P_W(w) = P[W = w] = P[X + Y = w].  (1)
Suppose X = k; then W = w if and only if Y = w − k. To find all ways that X + Y = w, we
Problem 4.10.6 Solution
We will solve this problem when the probability of heads is p. For the fair coin, p = 1/2. The number X_1 of flips until the first heads and the number X_2 of additional flips for the second heads both have the geometric PMF
P_{X_1}(x) = P_{X_2}(x) = (1 − p)^(x−1) p  for x = 1, 2, . . .
[Two tree diagrams rendered here in the original: each starts with branches P_X(0) and P_X(1), which then split into Y = 0 and Y = 1 with the conditional probabilities P_{Y|X}(y|x) marked on the branches (the values 1/4, 1/2, 3/4, and 1/3 appear); the diagrams themselves are not recoverable from the extracted text.]
Since the trials needed to generate successes 2 through 100 are independent of the trials that yield the first success, N − T and T are independent. Hence
P_{N|T}(n|t) = P_{N−T|T}(n − t|t) = P_{N−T}(n − t)  (3)
Applying the PMF of N − T found above, we have
P_{N|T}(n|t) = C(n − t − 1, 98) p^99 (1 − p)^(n−t−99)  for n = 99 + t, 100 + t, . . .; 0 otherwise.  (4)
We can calculate the requested moments.
E[X] = (3/4)(0) + (1/4)(20) = 5  (2)
Var[X] = (3/4)(0 − 5)^2 + (1/4)(20 − 5)^2 = 75  (3)
E[X + Y] = E[X] + E[Y] = 2E[X] = 10  (4)
Since X and Y are independent, Theorem 4.27 yields
Var[X + Y] = Var[X] + Var[Y] = 2 Var[X] = 150.
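These moments are easy to verify numerically. The short Python check below (Python rather than the MATLAB used elsewhere in these solutions) recomputes E[X], Var[X], and the sums for X + Y from the two-point PMF:

```python
from fractions import Fraction as F

# two-point PMF from the solution above: P[X=0] = 3/4, P[X=20] = 1/4
pmf = {0: F(3, 4), 20: F(1, 4)}

EX = sum(x * p for x, p in pmf.items())                # E[X]
VarX = sum((x - EX) ** 2 * p for x, p in pmf.items())  # Var[X]

print(EX, VarX)          # 5 75
print(2 * EX, 2 * VarX)  # E[X+Y] = 10, Var[X+Y] = 150 for i.i.d. X, Y
```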
The complete expressions for the marginal PMFs are
P_M(m) = (1 − p)^(m−1) p  for m = 1, 2, . . .; 0 otherwise.  (9)
P_N(n) = (n − 1)(1 − p)^(n−2) p^2  for n = 2, 3, . . .; 0 otherwise.  (10)
Not surprisingly, if we view each voice call as a successful Bernoulli trial, M has a geometric PMF.
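As a quick sanity check (a Python sketch; p = 0.4 is an arbitrary illustrative value, not from the problem), both marginal PMFs should sum to 1:

```python
p = 0.4  # illustrative; any 0 < p < 1 works

# P_M(m) = (1-p)^(m-1) p           (geometric, m = 1, 2, ...)
# P_N(n) = (n-1)(1-p)^(n-2) p^2    (Pascal PMF for the 2nd success, n = 2, 3, ...)
sum_M = sum((1 - p) ** (m - 1) * p for m in range(1, 2000))
sum_N = sum((n - 1) * (1 - p) ** (n - 2) * p ** 2 for n in range(2, 2000))

print(round(sum_M, 9), round(sum_N, 9))  # both approach 1
```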
(c) It is possible to find the marginal PMFs by summing the joint PMF. However, it is much easier to obtain the marginal PMFs by consideration of the experiment. Specifically, when a bus arrives, it is boarded with probability q. Moreover, the experiment ends when a bus is boarded.
Problem 4.9.10 Solution
This problem is fairly easy when we use conditional PMFs. In particular, given that N = n pizzas were sold before noon, each of those pizzas has mushrooms with probability 1/3. The conditional PMF of M given N is the binomial distribution.
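Concretely, the conditional PMF is P_{M|N}(m|n) = C(n, m)(1/3)^m (2/3)^(n−m) for m = 0, 1, . . . , n. A small Python check (with an illustrative n = 10, not from the problem) confirms it sums to 1 and has conditional mean n/3:

```python
from math import comb

def pmf_M_given_N(m, n, q=1/3):
    # binomial(n, 1/3): each of the n pizzas has mushrooms w.p. 1/3
    return comb(n, m) * q ** m * (1 - q) ** (n - m)

n = 10  # illustrative value of N
total = sum(pmf_M_given_N(m, n) for m in range(n + 1))
mean = sum(m * pmf_M_given_N(m, n) for m in range(n + 1))
print(round(total, 9), round(mean, 6))  # 1.0 and n/3
```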
Problem 4.9.12 Solution
We are given that the joint PDF of X and Y is
f_{X,Y}(x,y) = 1/(π r^2)  for 0 ≤ x^2 + y^2 ≤ r^2; 0 otherwise.  (1)
(a) The marginal PDF of X is
f_X(x) = 2 ∫_0^√(r^2 − x^2) (1/(π r^2)) dy = 2√(r^2 − x^2)/(π r^2)  for −r ≤ x ≤ r; 0 otherwise.  (2)
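As a check on the marginal (a Python sketch, assuming the uniform-disk density 1/(πr^2) with an illustrative radius r = 3), f_X should integrate to 1 over [−r, r]:

```python
import math

r = 3.0  # illustrative radius

def f_X(x):
    # marginal of the uniform PDF on the disk x^2 + y^2 <= r^2
    return 2.0 * math.sqrt(r * r - x * x) / (math.pi * r * r)

n = 200_000                  # midpoint rule over [-r, r]
h = 2 * r / n
total = sum(f_X(-r + (k + 0.5) * h) for k in range(n)) * h
print(round(total, 6))       # approx 1.0
```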
The conditional PDF of Y given
Problem 4.9.5 Solution
Random variables X and Y have joint PDF
f_{X,Y}(x,y) = 2  for 0 ≤ y ≤ x ≤ 1; 0 otherwise.  (1)
For 0 ≤ x ≤ 1, the marginal PDF for X satisfies
f_X(x) = ∫ f_{X,Y}(x,y) dy = ∫_0^x 2 dy = 2x.  (2)
Note that f_X(x) = 0 for x < 0 or x > 1. Hence the complete expression for the marginal PDF is
f_X(x) = 2x  for 0 ≤ x ≤ 1; 0 otherwise.  (3)
Problem 4.9.3 Solution
f_{X,Y}(x,y) = x + y  for 0 ≤ x, y ≤ 1; 0 otherwise.  (1)
(a) The conditional PDF f_{X|Y}(x|y) is defined for all y such that 0 ≤ y ≤ 1. For 0 ≤ y ≤ 1,
f_{X|Y}(x|y) = f_{X,Y}(x,y)/f_Y(y) = (x + y) / ∫_0^1 (x + y) dx = (x + y)/(y + 1/2)  for 0 ≤ x ≤ 1; 0 otherwise.  (2)
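Since the denominator is just the marginal f_Y(y) = y + 1/2, the conditional density should integrate to 1 in x for every fixed y. A Python sketch (midpoint rule, which is exact here because the integrand is linear in x; the tested y values are illustrative):

```python
def f_X_given_Y(x, y):
    # f_{X|Y}(x|y) = (x + y) / (y + 1/2) on 0 <= x <= 1
    return (x + y) / (y + 0.5)

def integral_over_x(y, n=100_000):
    # midpoint-rule integral of the conditional density over 0 <= x <= 1
    h = 1.0 / n
    return sum(f_X_given_Y((k + 0.5) * h, y) for k in range(n)) * h

print([round(integral_over_x(y), 6) for y in (0.0, 0.25, 0.9)])  # [1.0, 1.0, 1.0]
```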
(b) Th
Problem 4.9.8 Solution
First we need to find the conditional expectations
E[B|A = −1] = Σ_{b=0}^{1} b P_{B|A}(b|−1) = 0(1/3) + 1(2/3) = 2/3  (1)
E[B|A = 1] = Σ_{b=0}^{1} b P_{B|A}(b|1) = 0(1/2) + 1(1/2) = 1/2  (2)
Keep in mind that E[B|A] is a random variable that is a function of A.
(b) Since P_A(1) = P_{A,B}(1, 0) + P_{A,B}(1, 1) = 2/3,
P_{B|A}(b|1) = P_{A,B}(1, b)/P_A(1) = 1/2  for b = 0, 1; 0 otherwise.  (3)
If A = 1, the conditional expectation of B is
E[B|A = 1] = Σ_{b=0}^{1} b P_{B|A}(b|1) = P_{B|A}(1|1) = 1/2.  (4)
(c) Before finding the conditional PMF
Problem 4.9.1 Solution
The main part of this problem is just interpreting the problem statement. No calculations are necessary. Since a trip is equally likely to last 2, 3, or 4 days,
P_D(d) = 1/3  for d = 2, 3, 4; 0 otherwise.  (1)
Given a trip lasts d days, the
(b) Before we find E[B], it will prove helpful to find the marginal PMFs P_B(b) and P_M(m). These can be found from the row and column sums of the table of the joint PMF:

P_{B,M}(b,m)   m = 60   m = 180   P_B(b)
b = 1          0.3      0.2       0.5
b = 2          0.1      0.2       0.3
b = 3          0.1      0.1       0.2
P_M(m)         0.5      0.5
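The row and column sums can be checked mechanically. The Python snippet below rebuilds both marginals from the joint table (the b = 2 and b = 3 row sums, 0.3 and 0.2, follow directly from the table entries):

```python
# joint PMF P_{B,M}(b, m) read off the table: rows b, columns m
PBM = {
    (1, 60): 0.3, (1, 180): 0.2,
    (2, 60): 0.1, (2, 180): 0.2,
    (3, 60): 0.1, (3, 180): 0.1,
}
# marginals are row sums (P_B) and column sums (P_M)
PB = {b: round(sum(p for (bb, _), p in PBM.items() if bb == b), 10) for b in (1, 2, 3)}
PM = {m: round(sum(p for (_, mm), p in PBM.items() if mm == m), 10) for m in (60, 180)}
print(PB)  # {1: 0.5, 2: 0.3, 3: 0.2}
print(PM)  # {60: 0.5, 180: 0.5}
```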
(c) To find the conditional PMF P_{D|N}(d|2), we first need to find the probability of the conditioning event
P_N(2) = P_{N,D}(2, 20) + P_{N,D}(2, 100) + P_{N,D}(2, 300) = 0.4  (3)
The conditional PMF of D given N = 2 is
P_{D|N}(d|2) = P_{N,D}(2, d)/P_N(2) =
  1/4   d = 20
  1/2   d = 100
  1/4   d = 300
  0     otherwise  (4)
function covxy=finitecov(SX,SY,PXY);
%Usage: cxy=finitecov(SX,SY,PXY)
%returns the covariance of
%finite random variables X and Y
%given by grids SX, SY, and PXY
ex=finiteexp(SX,PXY);
ey=finiteexp(SY,PXY);
R=finiteexp(SX.*SY,PXY);
covxy=R-ex*ey;
The follow
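For readers working outside MATLAB, here is a Python sketch of the same finiteexp/finitecov pair, using flat lists in place of the grid matrices (the flattening mirrors the sx(:), px(:) reshaping noted earlier):

```python
def finiteexp(sx, px):
    # expected value of a finite random variable:
    # sx = sample values, px = matching probabilities
    return sum(s * p for s, p in zip(sx, px))

def finitecov(sx, sy, pxy):
    # Cov[X, Y] = E[XY] - E[X] E[Y], with all expectations
    # taken against the joint probabilities pxy
    ex = finiteexp(sx, pxy)
    ey = finiteexp(sy, pxy)
    r = finiteexp([x * y for x, y in zip(sx, sy)], pxy)
    return r - ex * ey

# toy joint PMF: X = Y with P[X=0] = P[X=1] = 1/2 (perfectly dependent)
print(finitecov([0, 1], [0, 1], [0.5, 0.5]))  # 0.25
```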
Problem 4.11.1 Solution
f_{X,Y}(x,y) = c e^(−(x^2/8) − (y^2/18))  (1)
The omission of any limits for the PDF indicates that it is defined over all x and y. We know that f_{X,Y}(x,y) is in the form of the bivariate Gaussian distribution, so we look to Definition 4.17
The marked integral equals 1 because for each value of x, it is the integral of a Gaussian PDF of
one variable over all possible values. In fact, it is the integral of the conditional PDF f_{Y|X}(y|x) over
all possible y. To complete the proof, we see that
Given T = t, the conditional probability the person is declared ill is
P[I|T = t] = P[W > 10|T = t]  (11)
= P[ (W − (7 + 2(t − 37)))/√2 > (10 − (7 + 2(t − 37)))/√2 ]  (12)
= P[ Z > (3 − 2(t − 37))/√2 ] = Q( (3 − 2(t − 37))/√2 )  (13)
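Assuming the tail expression Q((3 − 2(t − 37))/√2) above, P[I|T = t] can be evaluated with the standard-normal tail written via the complementary error function, Q(z) = erfc(z/√2)/2. A Python sketch (the sample temperatures are illustrative):

```python
from math import erfc, sqrt

def Q(z):
    # standard normal tail probability Q(z) = P[Z > z]
    return 0.5 * erfc(z / sqrt(2))

def p_ill_given_t(t):
    # P[I | T = t] = Q((3 - 2(t - 37)) / sqrt(2))
    return Q((3 - 2 * (t - 37)) / sqrt(2))

# the declared-ill probability increases with temperature t
for t in (37.0, 38.0, 38.5):
    print(t, round(p_ill_given_t(t), 4))
```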
Problem 4.11.6 Solution
The given joint PDF is
f X,Y (x, y) =
(b) In this case, the joint PDF of X and Y is inversely proportional to the area of the target.
f_{X,Y}(x,y) = 1/(π 50^2)  for x^2 + y^2 ≤ 50^2; 0 otherwise.  (4)
The probability of a bullseye is
P[B] = P[X^2 + Y^2 ≤ 2^2] = (π 2^2)/(π 50^2) = (1/25)^2 = 0.0016.  (5)
(c) In this ins
[Tree diagram rendered here in the original: the first branch splits on T > 38 (probability p) versus T ≤ 38 (probability 1 − p); the T > 38 branch then splits on W > 10 (probability q) versus W ≤ 10 (probability 1 − q).]
The probability the person is ill is
P[I] = P[T > 38, W > 10] = P[T > 38] P[W > 10] = pq = 0.0107.
(b) The general form of the bivariate Gaussian PDF is
f_{W,T}(w,t) = (1/(2π σ_1 σ_2 √(1 − ρ^2))) exp( −[ ((w − μ_1)/σ_1)^2 − 2ρ(w − μ_1)(t − μ_2)/(σ_1 σ_2) + ((t − μ_2)/σ_2)^2 ] / (2(1 − ρ^2)) )
F_{X,W}(x, w) = P[{X ≤ x} ∩ {X < Y ≤ X + w}]
= ∫_0^x ∫_(x′)^(x′+w) λ^2 e^(−λy) dy dx′  (3)
= ∫_0^x ( −λ e^(−λy) |_(x′)^(x′+w) ) dx′  (4)
= ∫_0^x ( λ e^(−λx′) − λ e^(−λ(x′+w)) ) dx′  (5)
= ( e^(−λ(x′+w)) − e^(−λx′) ) |_0^x  (6)
= (1 − e^(−λx))(1 − e^(−λw))
We see that F_{X,W}(x,w) = F_X(x) F_W(w). Moreover, by applying Theorem 4.4,
f_{X,W}(x,w) = ∂^2 F_{X,W}(x,w)/(∂x ∂w) = λ^2 e^(−λ(x+w))  for x ≥ 0, w ≥ 0.  (7)
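The factorization F_{X,W}(x,w) = F_X(x)F_W(w) can be spot-checked numerically. The Python sketch below (illustrative values λ = 1.3, x = 0.8, w = 1.5) integrates the joint density λ^2 e^(−λy) over the region 0 ≤ u ≤ x, u < y ≤ u + w and compares the result with (1 − e^(−λx))(1 − e^(−λw)):

```python
import math

lam, x, w = 1.3, 0.8, 1.5  # illustrative values

closed_form = (1 - math.exp(-lam * x)) * (1 - math.exp(-lam * w))

# midpoint-rule double integral of lam^2 e^{-lam y} over
# 0 <= u <= x, u < y <= u + w
n = 800
hu, hy = x / n, w / n
total = 0.0
for i in range(n):
    u = (i + 0.5) * hu
    for j in range(n):
        y = u + (j + 0.5) * hy
        total += lam ** 2 * math.exp(-lam * y) * hu * hy

print(round(closed_form, 4), round(total, 4))  # the two agree
```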
(c) Since ρ = 1/√2, we can now solve for σ_X and σ_Y:
σ_X = 1/√2,  σ_Y = 1/2  (6)
(d) From here we can solve for c:
c = 1/(2π σ_X σ_Y √(1 − ρ^2)) = 2/π  (7)
(e) X and Y are dependent because ρ ≠ 0.
Problem 4.11.3 Solution
From the problem statement, we learn that
The complete expression for the joint CDF is
F_{W,Y}(w,y) =
  1 − e^(−λw) − λw e^(−λy)   0 ≤ w ≤ y
  1 − (1 + λy) e^(−λy)       0 ≤ y ≤ w
  0                          otherwise  (19)
Applying Theorem 4.4 yields
f_{W,Y}(w,y) = ∂^2 F_{W,Y}(w,y)/(∂w ∂y) = λ^2 e^(−λy)  for 0 ≤ w ≤ y; 0 otherwise.  (20)
The joint PDF f_{W,Y}(w,y) doesn't factor, and thus W and Y are dependent.
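Even though the PDF does not factor, it should still integrate to 1 over the wedge 0 ≤ w ≤ y. A Python sketch (illustrative λ = 0.7; integrating out w first leaves λ^2 y e^(−λy)):

```python
import math

lam = 0.7  # illustrative rate

# integrate f_{W,Y}(w, y) = lam^2 e^{-lam y} over 0 <= w <= y < inf;
# the inner integral over w contributes a factor of y
n = 400_000
ymax = 80.0 / lam          # truncate the (negligible) exponential tail
h = ymax / n
total = sum(lam ** 2 * ((k + 0.5) * h) * math.exp(-lam * (k + 0.5) * h) * h
            for k in range(n))
print(round(total, 6))     # approx 1.0
```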