We conclude that Y1 and Y2 have covariance
Cov[Y1, Y2] = CY(1, 2) = (σ1² − σ2²) sin θ cos θ.   (4)
Since Y1 and Y2 are jointly Gaussian, they are independent if and only if Cov[Y1, Y2] = 0. Thus, Y1 and Y2 are independent if σ1² = σ2² or if θ is an integer multiple of π/2.
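As a quick numerical sanity check of (4), the sketch below (in Python rather than the manual's MATLAB) assumes Y is the rotation by angle θ of two independent zero-mean Gaussians X1 and X2 with variances σ1² and σ2², which is the standard setup behind this formula:

```python
import numpy as np

# Assumed setup (not from the original solution): Y = R(theta) X with
# independent X1 ~ N(0, s1^2), X2 ~ N(0, s2^2).
rng = np.random.default_rng(0)
s1, s2, theta = 2.0, 1.0, np.pi / 6
n = 1_000_000
X = np.vstack([s1 * rng.standard_normal(n), s2 * rng.standard_normal(n)])
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
Y = R @ X
sample_cov = np.cov(Y)[0, 1]
predicted = (s1**2 - s2**2) * np.sin(theta) * np.cos(theta)
print(sample_cov, predicted)
```

With a million samples the empirical covariance agrees with (σ1² − σ2²) sin θ cos θ to a few decimal places.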
(c) Y has correlation matrix
RY = CY + μY μY′ = (1/9)[43 55; 55 103] + [8; 0][8 0] = (1/9)[619 55; 55 103].   (6)
(d) From μY, we see that E[Y2] = 0. From the covariance matrix CY, we learn that Y2 has variance σ2² = CY(2, 2) = 103/9.
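The arithmetic in (6) is easy to confirm; a quick Python check:

```python
import numpy as np

# Verify R_Y = C_Y + mu_Y mu_Y' from (6): (1/9)*43 + 8^2 = 619/9 in the
# top-left entry, and the other entries are unchanged.
CY = np.array([[43.0, 55.0], [55.0, 103.0]]) / 9
muY = np.array([[8.0], [0.0]])
RY = CY + muY @ muY.T
print(9 * RY)
```

The printout is the matrix [619 55; 55 103], matching (6), and CY(2, 2) = 103/9 is the variance of Y2 used in part (d).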
function w=wrv1(lambda,mu,m)
%Usage: w=wrv1(lambda,mu,m)
%Return m samples of W=Y/X
%X is exponential (lambda)
%Y is exponential (mu)
x=exponentialrv(lambda,m);
y=exponentialrv(mu,m);
w=y./x;
(b) For n = 3,
1 − P[min_i Xi ≤ 3/4] = P[min_i Xi > 3/4]   (5)
= P[X1 > 3/4, X2 > 3/4, X3 > 3/4]   (6)
= ∫_{3/4}^{1} ∫_{3/4}^{1} ∫_{3/4}^{1} dx1 dx2 dx3   (7)
= (1 − 3/4)³ = 1/64   (8)
Thus P[min_i Xi ≤ 3/4] = 63/64.
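The steps (5)–(8) can be confirmed both exactly and by simulation; a small Python check:

```python
import random

# For three iid uniform (0,1) random variables,
# P[min Xi <= 3/4] = 1 - (1/4)^3 = 63/64; confirm by Monte Carlo.
exact = 1 - (1 - 3/4) ** 3
random.seed(1)
m = 200_000
count = sum(min(random.random() for _ in range(3)) <= 3/4 for _ in range(m))
print(exact, count / m)
```

The simulated frequency lands within a fraction of a percent of 63/64 = 0.984375.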
Problem 5.3.7 Solution
(a) Note that Z is the number of three page faxes. In principle, we can sum the joint PMF PX,Y,Z(x, y, z) over all x, y to find PZ(z). However, it is better to realize that each
Problem 5.5.2 Solution
The random variable Jn is the number of times that message n is transmitted. Since each transmission is a success with probability p, independent of any other transmission, the
Problem 5.6.4 Solution
Inspection of the vector PDF fX(x) will show that X1, X2, X3, and X4 are iid uniform (0, 1) random variables. That is,
fX(x) = fX1(x1) fX2(x2) fX3(x3) fX4(x4)   (1)
Problem 5.6.9 Solution
Given an arbitrary random vector X, we can define Y = X − μX so that
CX = E[(X − μX)(X − μX)′] = E[YY′] = RY.   (1)
It follows that the covariance matrix CX is positive semi-definite if and only if the correlation matrix RY is positive semi-definite.
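The positive semi-definiteness claim is easy to illustrate numerically: any sample version of E[YY′] is a Gram matrix, so its eigenvalues are nonnegative. A Python sketch (the mixing matrix is a hypothetical example, not from the problem):

```python
import numpy as np

# A sample estimate of C_X = E[(X - mu)(X - mu)'] has the Gram form Y Y'/m,
# hence nonnegative eigenvalues (positive semi-definite).
rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))          # arbitrary mixing matrix (illustrative)
X = A @ rng.standard_normal((3, 5000))   # correlated samples
Y = X - X.mean(axis=1, keepdims=True)    # Y = X - mu_X (sample mean)
CX = (Y @ Y.T) / Y.shape[1]
eigvals = np.linalg.eigvalsh(CX)
print(eigvals)
```

All printed eigenvalues are nonnegative (up to floating-point roundoff), as the argument predicts.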
The condition Āw = 0 implies
[A; ÃCX⁻¹](A′v + Ã′ṽ) = [0; 0].   (4)
This implies
A A′ v + A Ã′ ṽ = 0   (5)
ÃCX⁻¹ A′ v + ÃCX⁻¹ Ã′ ṽ = 0   (6)
Since A Ã′ = 0, Equation (5) implies that A A′ v = 0. Since A is rank m, A A′ is
Following similar steps, one can show that
fX2(x2) = ∫₀^∞ ∫₀^∞ fX(x) dx1 dx3 = { 2e^(−2x2),  x2 ≥ 0;  0,  otherwise.   (6)
fX3(x3) = ∫₀^∞ ∫₀^∞ fX(x) dx1 dx2 = { 3e^(−3x3),  x3 ≥ 0;  0,  otherwise.   (7)
Thus
fX(x) = fX1(x1
The covariance matrix of W is
CW = E[(W − μW)(W − μW)′]   (2)
= E[ [X − μX; Y − μY] [(X − μX)′ (Y − μY)′] ]   (3)
= [ E[(X − μX)(X − μX)′]  E[(X − μX)(Y − μY)′] ; E[(Y − μY)(X − μX)′]  E[(Y − μY)(Y − μY)′] ]   (4)
= [ CX CXY ; CYX CY ].   (5)
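The block structure in (5) can be seen directly from sample covariances; a short Python illustration (the particular X, Y used here are hypothetical):

```python
import numpy as np

# The sample covariance of W = [X; Y] has C_X and C_Y on the diagonal
# and C_XY = C_YX' in the off-diagonal blocks.
rng = np.random.default_rng(3)
X = rng.standard_normal((2, 10_000))
Y = 0.5 * X + rng.standard_normal((2, 10_000))   # Y correlated with X
W = np.vstack([X, Y])
CW = np.cov(W)
CX, CXY = CW[:2, :2], CW[:2, 2:]
CYX, CY = CW[2:, :2], CW[2:, 2:]
print(np.allclose(CXY, CYX.T), np.allclose(CX, np.cov(X)))
```

Both checks print True: the cross-covariance blocks are transposes of each other, and the top-left block is exactly the covariance of X alone.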
The assumption that X and Y a
Problem Solutions Chapter 5
Problem 5.1.1 Solution
The repair of each laptop can be viewed as an independent trial with four possible outcomes corresponding to the four types of needed repairs.
(a) Si
[Figure: histogram of the simulated jackpot J. Frequency (0 to 150) versus J, with J ranging over 1.7076×10^7 to 1.7096×10^7.]
If you go back and solve Problem 5.5.5, you will see that the jackpot J has expected value
E[
Problem 5.4.2 Solution
The random variables N1, N2, N3 and N4 are dependent. To see this, we observe that PNi(4) = pi⁴. However,
PN1,N2,N3,N4(4, 4, 4, 4) = 0 ≠ p1⁴ p2⁴ p3⁴ p4⁴ = PN1(4) PN2
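The dependence argument can be checked numerically: with n = 4 trials, the outcome N1 = N2 = N3 = N4 = 4 would require 16 trials, so its joint probability is zero while each marginal PNi(4) = pi⁴ is positive. A Python sketch with hypothetical pi values:

```python
from math import comb, prod

# Multinomial PMF written from first principles; counts that do not sum to
# the number of trials n are infeasible and get probability 0.
def multinomial_pmf(counts, probs, n):
    if sum(counts) != n:
        return 0.0
    coef, rem = 1, n
    for c in counts:
        coef *= comb(rem, c)
        rem -= c
    return coef * prod(p ** c for p, c in zip(probs, counts))

p = [0.25, 0.25, 0.25, 0.25]          # hypothetical repair probabilities
joint = multinomial_pmf([4, 4, 4, 4], p, 4)
marginals_product = prod(pi ** 4 for pi in p)
print(joint, marginals_product)
```

The joint PMF is 0 while the product of marginals is positive, so the Ni cannot be independent.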
The off-diagonal zero blocks are a consequence of (Y1, Y2) being independent of (Y3, Y4). Along the diagonal, the two identical sub-blocks occur because fY1,Y2(x, y) = fY3,Y4(x, y). In short, the
matr
Problem Solutions Chapter 6
Problem 6.1.1 Solution
The random variable X33 is a Bernoulli random variable that indicates the result of flip 33. The PMF of X33 is
PX33(x) = { 1 − p,  x = 0;  p,  x = 1;  0,  otherwise.   (1)
This implies
E[Y1] = E[Y3] = ∫₀¹ 2y(1 − y) dy = 1/3   (3)
E[Y2] = E[Y4] = ∫₀¹ 2y² dy = 2/3   (4)
Thus Y has expected value E[Y] = [1/3 2/3 1/3 2/3]′. The second part of the problem is to find the corre
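The integrals in (3) and (4) are simple polynomial integrals; an exact Python check via the power rule:

```python
from fractions import Fraction

# Integrate sum_k coeffs[k]*y^k over [0,1] exactly: each y^k contributes
# coeffs[k]/(k+1).
def poly_integral_01(coeffs):
    return sum(Fraction(c) / (k + 1) for k, c in enumerate(coeffs))

EY1 = poly_integral_01([0, 2, -2])   # integrand 2y(1-y) = 2y - 2y^2
EY2 = poly_integral_01([0, 0, 2])    # integrand 2y^2
print(EY1, EY2)
```

This prints 1/3 and 2/3, matching (3) and (4).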
Finally, the probability that more laptops require motherboard repairs than keyboard repairs
is
P [N2 > N3 ] = PN2 ,N3 (1, 0) + PN2 ,N3 (2, 0) + PN2 ,N3 (2, 1) + PN2 (3) + PN2 (4)
(10)
where we use th
function p=sailboats(w,m)
%Usage: p=sailboats(w,m)
%In Problem 5.8.4, W is the
%winning time in a 10 boat race.
%We use m trials to estimate
%P[W<=w]
CX=5*eye(10)+20*ones(10,10);
mu=35*ones(10,1);
X
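The same estimate can be sketched in Python with NumPy's multivariate normal sampler; the threshold w = 25 below is a hypothetical choice (the MATLAB function takes w as an input), while the mean 35 and covariance 5I + 20·ones come from the snippet above:

```python
import numpy as np

# W is the winning (minimum) time among 10 jointly Gaussian finish times
# with mean 35 and C_X = 5*I + 20*ones; estimate P[W <= w] by simulation.
rng = np.random.default_rng(4)
m = 100_000
mu = 35 * np.ones(10)
CX = 5 * np.eye(10) + 20 * np.ones((10, 10))
times = rng.multivariate_normal(mu, CX, size=m)
W = times.min(axis=1)
p = (W <= 25.0).mean()                   # hypothetical threshold w = 25
print(p)
```

The strong common component (the 20·ones term) makes the ten finish times highly correlated, which is why a single draw of the common factor dominates the race outcome.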
Problem 5.4.7 Solution
Since U1, . . . , Un are iid uniform (0, T) random variables,
fU1,...,Un(u1, . . . , un) = { 1/T^n,  0 ≤ ui ≤ T; i = 1, 2, . . . , n;  0,  otherwise   (1)
Since U1 , . . . , Un are cont
Problem 5.3.6 Solution
In Example 5.1, random variables N1 , . . . , Nr have the multinomial distribution
PN1,...,Nr(n1, . . . , nr) = (n choose n1, . . . , nr) p1^n1 · · · pr^nr   (1)
where n > r > 2.
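As a sanity check, the multinomial PMF in (1) sums to 1 over all feasible count vectors; a small Python verification for a hypothetical case with n = 5 trials and r = 3 categories:

```python
from math import comb
from itertools import product

# Sum the multinomial PMF over all (n1, n2, n3) with n1+n2+n3 = n.
n, probs = 5, [0.5, 0.3, 0.2]     # hypothetical example parameters
total = 0.0
for n1, n2 in product(range(n + 1), repeat=2):
    n3 = n - n1 - n2
    if n3 < 0:
        continue
    coef = comb(n, n1) * comb(n - n1, n2)   # multinomial coefficient
    total += coef * probs[0]**n1 * probs[1]**n2 * probs[2]**n3
print(total)
```

The total is 1 (up to floating-point roundoff), as guaranteed by the multinomial theorem since p1 + p2 + p3 = 1.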
(a) To
The same recursion will also allow us to show that
E[J2] = (3/2)^8 10^12 + (1/2)[(3/2)^6 + (3/2)^5 + (3/2)^4 + (3/2)^3](10^6/4)   (14)
E[J1] = (3/2)^10 10^12 + (1/2)[(3/2)^8 + (3/2)^7 + (3/2)^6 + (3/2)^5 + (3/2)^4](10^6/4)   (15)
The complete expression for the joint PDF of Y1 and Y2 is
fY1,Y2(y1, y2) = { 12(1 − y2)²,  0 ≤ y1 ≤ y2 ≤ 1;  0,  otherwise   (9)
For 0 ≤ y1 ≤ 1, the marginal PDF of Y1 can be found from
fY1(y1) = ∫_{y1}^{1} fY1,Y2(y1,
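It is worth confirming that the joint PDF in (9) integrates to 1 over its triangular support; an exact Python check:

```python
from fractions import Fraction

# Integrating 12(1-y2)^2 over 0 <= y1 <= y2 <= 1: the inner integral over y1
# gives 12*y2*(1-y2)^2 = 12*y2 - 24*y2^2 + 12*y2^3; integrate that over [0,1].
total = Fraction(12, 2) - Fraction(24, 3) + Fraction(12, 4)
print(total)
```

The result is exactly 1, so (9) is a valid PDF.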
function err=poissonsigma(a,k)
xmin=max(0,floor(a-k*sqrt(a)));
xmax=a+ceil(k*sqrt(a));
sx=xmin:xmax;
logfacts=cumsum([0,log(1:xmax)]);
%logfacts includes 0 in case xmin=0
%Now we extract needed values:
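The `logfacts` line uses a cumulative sum of logs so that `logfacts(k+1) = log(k!)` without ever computing a huge factorial; the same trick in Python (with a hypothetical xmax = 20):

```python
import math
import numpy as np

# logfacts[k] = log(k!), built by a cumulative sum of logs; the leading 0.0
# handles log(0!) = 0 so that k = 0 is covered.
xmax = 20
logfacts = np.cumsum(np.concatenate(([0.0], np.log(np.arange(1, xmax + 1)))))
print(math.exp(logfacts[5]))   # close to 120 = 5!
```

Working in log space keeps the subsequent Poisson PMF evaluation `exp(-a + x*log(a) - logfacts[x])` numerically stable even for large x.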
Given fX(x) with c = 2/3 and a1 = a2 = a3 = 1 in Problem 5.2.2, find the marginal PDF fX3(x3).
Filling in the parameters in Problem 5.2.2, we obtain the vector PDF
fX(x) = { (2/3)(x1 + x2 + x3),  0 ≤ x1, x2, x3 ≤ 1;  0,  otherwise
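Assuming the unit-cube support above, integrating out x1 and x2 gives the requested marginal, which can be checked exactly in Python:

```python
from fractions import Fraction

# With f_X(x) = (2/3)(x1+x2+x3) on the unit cube, integrating over x1 and x2
# gives f_X3(x3) = (2/3)(1/2 + 1/2 + x3) = (2/3)(1 + x3) for 0 <= x3 <= 1.
# Check that this marginal integrates to 1: ∫0^1 (2/3)(1 + x3) dx3.
fx3_integral = Fraction(2, 3) * (1 + Fraction(1, 2))
print(fx3_integral)
```

The marginal integrates to exactly 1, which also confirms that the joint PDF itself is properly normalized.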
The above expression may seem unwieldy and it isn't even clear that it will sum to 1. To simplify the expression, we observe that
PX,Y(x, y) = PX,Y,Z(x, y, 5 − x − y) = PX,Y|Z(x, y|5 − x − y) PZ(5 − x − y)
function x=bigpoissonrv(alpha)
%for vector alpha, returns a vector x such that
% x(i) is a Poisson (alpha(i)) rv
%set up Poisson CDF from xmin to xmax for each alpha(i)
alpha=alpha(:);
amin=min(alpha(:));
Problem 5.6.1 Solution
(a) The covariance matrix of X = [X1 X2]′ is
CX = [ Var[X1] Cov[X1, X2] ; Cov[X1, X2] Var[X2] ] = [4 3; 3 9].   (1)
(b) From the problem statement,
[Y1; Y2] = [1 2; 3 4] X = AX.   (2)