We conclude that $Y_1$ and $Y_2$ have covariance
\[
\operatorname{Cov}[Y_1, Y_2] = C_Y(1,2) = (\sigma_1^2 - \sigma_2^2)\sin\theta\cos\theta. \tag{4}
\]
Since $Y_1$ and $Y_2$ are jointly Gaussian, they are independent if and only if $\operatorname{Cov}[Y_1, Y_2] = 0$. Thus, $Y_1$ and $Y_2$ are independent for all $\theta$ if and only if $\sigma_1^2 = \sigma_2^2$.
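As a quick numerical sanity check (not part of the original solution), one can build $\mathbf{Y} = A\mathbf{X}$ for hypothetical values of $\sigma_1$, $\sigma_2$, $\theta$ and verify that the $(1,2)$ entry of $AC_XA'$ matches $(\sigma_1^2 - \sigma_2^2)\sin\theta\cos\theta$. The rotation convention $Y_1 = X_1\cos\theta - X_2\sin\theta$, $Y_2 = X_1\sin\theta + X_2\cos\theta$ is an assumption of this sketch:

```python
import numpy as np

# Hypothetical values for the sketch: sigma1, sigma2, theta are not from the text.
s1, s2, theta = 2.0, 1.0, 0.3
CX = np.diag([s1**2, s2**2])                    # independent X1, X2
A = np.array([[np.cos(theta), -np.sin(theta)],  # assumed rotation convention
              [np.sin(theta),  np.cos(theta)]])
CY = A @ CX @ A.T                               # covariance of Y = AX
predicted = (s1**2 - s2**2) * np.sin(theta) * np.cos(theta)
assert np.isclose(CY[0, 1], predicted)
```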
(c) $\mathbf{Y}$ has correlation matrix
\[
R_Y = C_Y + \mu_Y \mu_Y' = \frac{1}{9}\begin{bmatrix} 43 & 55 \\ 55 & 103 \end{bmatrix} + \begin{bmatrix} 8 \\ 0 \end{bmatrix}\begin{bmatrix} 8 & 0 \end{bmatrix} = \frac{1}{9}\begin{bmatrix} 619 & 55 \\ 55 & 103 \end{bmatrix}. \tag{6}
\]
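The matrix arithmetic is easy to confirm numerically; a minimal sketch:

```python
import numpy as np

# R_Y = C_Y + mu_Y mu_Y' with the numbers from this solution.
CY = np.array([[43.0, 55.0], [55.0, 103.0]]) / 9
muY = np.array([[8.0], [0.0]])
RY = CY + muY @ muY.T
# 43/9 + 64 = 619/9, so only the (1,1) entry changes.
assert np.allclose(RY, np.array([[619.0, 55.0], [55.0, 103.0]]) / 9)
```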
(d) From $\mu_Y$, we see that $E[Y_2] = 0$. From the covariance matrix $C_Y$, we learn that $Y_2$ has variance $\sigma_2^2 = C_Y(2,2) = 103/9$. Since $Y_2$ is a Gaussian random variable
function w=wrv1(lambda,mu,m)
%Usage: w=wrv1(lambda,mu,m)
%Return m samples of W=Y/X
%X is exponential (lambda)
%Y is exponential (mu)
x=exponentialrv(lambda,m);
y=exponentialrv(mu,m);
w=y./x;
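A Python analogue of `wrv1` (a sketch, not the text's code), with a sanity check: for $\lambda = \mu$ the pair $(X, Y)$ is exchangeable, so $P[W \le 1] = 1/2$. Note that NumPy's exponential sampler takes the mean $1/\lambda$, not the rate:

```python
import numpy as np

def wrv1(lam, mu, m, rng):
    """Return m samples of W = Y/X, X ~ exponential(lam), Y ~ exponential(mu)."""
    x = rng.exponential(1.0 / lam, m)   # NumPy takes the scale (mean), not the rate
    y = rng.exponential(1.0 / mu, m)
    return y / x

rng = np.random.default_rng(0)
w = wrv1(1.0, 1.0, 200_000, rng)
# For lam == mu, P[Y <= X] = 1/2 by symmetry, so about half the samples have W <= 1.
assert abs(np.mean(w <= 1.0) - 0.5) < 0.01
```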
function w=wrv2(lambda,mu,m)
%Usage: w=wrv2(lambda,mu,m)
%Retur
(b) For $n = 3$,
\[
\begin{aligned}
1 - P\Big[\min_i X_i \le 3/4\Big] &= P\Big[\min_i X_i > 3/4\Big] && (5) \\
&= P[X_1 > 3/4,\ X_2 > 3/4,\ X_3 > 3/4] && (6) \\
&= \int_{3/4}^{1}\int_{3/4}^{1}\int_{3/4}^{1} dx_1\, dx_2\, dx_3 && (7) \\
&= (1 - 3/4)^3 = 1/64. && (8)
\end{aligned}
\]
Thus $P[\min_i X_i \le 3/4] = 63/64$.
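The $63/64$ answer is easy to confirm both exactly and by simulation; a minimal Python check:

```python
import numpy as np

# Exact complement: P[min Xi > 3/4] = (1 - 3/4)^3 = 1/64.
assert (1 - 0.75) ** 3 == 1 / 64

# Monte Carlo confirmation with three iid uniform (0,1) samples per trial.
rng = np.random.default_rng(0)
x = rng.random((100_000, 3))
est = np.mean(x.min(axis=1) <= 0.75)
assert abs(est - 63 / 64) < 0.005
```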
Problem 5.2.1 Solution
This problem is very simple. In terms of
Problem 5.3.7 Solution
(a) Note that $Z$ is the number of three-page faxes. In principle, we can sum the joint PMF $P_{X,Y,Z}(x,y,z)$ over all $x, y$ to find $P_Z(z)$. However, it is better to realize that each fax has 3 pages with probability 1/6, independent of a
Problem 5.5.2 Solution
The random variable Jn is the number of times that message n is transmitted. Since each transmission is a success with probability p, independent of any other transmission, the number of transmissions of message n is independent of
Problem 5.6.4 Solution
Inspection of the vector PDF $f_X(\mathbf{x})$ will show that $X_1$, $X_2$, $X_3$, and $X_4$ are iid uniform $(0,1)$ random variables. That is,
\[
f_X(\mathbf{x}) = f_{X_1}(x_1)\, f_{X_2}(x_2)\, f_{X_3}(x_3)\, f_{X_4}(x_4) \tag{1}
\]
where each $X_i$ has the uniform $(0,1)$ PDF
Problem 5.6.9 Solution
Given an arbitrary random vector $\mathbf{X}$, we can define $\mathbf{Y} = \mathbf{X} - \mu_X$ so that
\[
C_X = E\big[(\mathbf{X} - \mu_X)(\mathbf{X} - \mu_X)'\big] = E\big[\mathbf{Y}\mathbf{Y}'\big] = R_Y. \tag{1}
\]
It follows that the covariance matrix $C_X$ is positive semidefinite if and only if the correlation matrix $R_Y$ is positive semidefinite.
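The identity $C_X = R_Y$ for the centered vector $\mathbf{Y} = \mathbf{X} - \mu_X$ is easy to see numerically: the sample versions of the two matrices coincide, and both are positive semidefinite (all eigenvalues $\ge 0$). The dimensions and mixing matrix below are arbitrary illustration values:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))                    # arbitrary mixing matrix
X = rng.standard_normal((50_000, 3)) @ B.T + [1.0, -2.0, 0.5]
Y = X - X.mean(axis=0)                             # centered: Y = X - mu_X (sample mean)
CX = Y.T @ Y / len(Y)                              # sample covariance matrix of X
RY = Y.T @ Y / len(Y)                              # sample correlation matrix E[YY'] of Y
assert np.allclose(CX, RY)                         # the same matrix, by construction
assert np.all(np.linalg.eigvalsh(CX) >= -1e-9)     # positive semidefinite
```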
The condition $\bar{A}\mathbf{w} = \mathbf{0}$ implies
\[
\begin{bmatrix} A \\ \tilde{A} C_X^{-1} \end{bmatrix}\big( A'\mathbf{v} + \tilde{A}'\bar{\mathbf{v}} \big) = \begin{bmatrix} \mathbf{0} \\ \mathbf{0} \end{bmatrix}. \tag{4}
\]
This implies
\[
A A' \mathbf{v} + A \tilde{A}' \bar{\mathbf{v}} = \mathbf{0}, \tag{5}
\]
\[
\tilde{A} C_X^{-1} A' \mathbf{v} + \tilde{A} C_X^{-1} \tilde{A}' \bar{\mathbf{v}} = \mathbf{0}. \tag{6}
\]
Since $A\tilde{A}' = 0$, Equation (5) implies that $A A' \mathbf{v} = \mathbf{0}$. Since $A$ is rank $m$, $A A'$ is an $m \times m$ rank $m$ matrix. It follows that $\mathbf{v} = \mathbf{0}$. We can then
Following similar steps, one can show that
\[
f_{X_2}(x_2) = \int_0^\infty\!\!\int_0^\infty f_X(\mathbf{x})\, dx_1\, dx_3 = \begin{cases} 2e^{-2x_2} & x_2 \ge 0, \\ 0 & \text{otherwise}, \end{cases} \tag{6}
\]
\[
f_{X_3}(x_3) = \int_0^\infty\!\!\int_0^\infty f_X(\mathbf{x})\, dx_1\, dx_2 = \begin{cases} 3e^{-3x_3} & x_3 \ge 0, \\ 0 & \text{otherwise}. \end{cases} \tag{7}
\]
Thus
\[
f_X(\mathbf{x}) = f_{X_1}(x_1)\, f_{X_2}(x_2)\, f_{X_3}(x_3). \tag{8}
\]
We conclude that $X_1$, $X_2$, and $X_3$ are independent.
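As a spot check (assuming, as these marginals imply, a joint PDF $f_X(\mathbf{x}) = 6e^{-x_1-2x_2-3x_3}$ for $x_i \ge 0$ with $f_{X_1}(x_1) = e^{-x_1}$ — the joint and first marginal are not quoted in this excerpt), the product of the three marginals reproduces the joint PDF pointwise:

```python
import math

def fX(x1, x2, x3):
    # joint PDF implied by the marginals (an assumption of this sketch)
    return 6 * math.exp(-x1 - 2 * x2 - 3 * x3)

for x1, x2, x3 in [(0.1, 0.2, 0.3), (1.0, 0.5, 2.0), (0.0, 0.0, 0.0)]:
    product = math.exp(-x1) * (2 * math.exp(-2 * x2)) * (3 * math.exp(-3 * x3))
    assert math.isclose(fX(x1, x2, x3), product)
```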
The covariance matrix of $\mathbf{W}$ is
\[
\begin{aligned}
C_W &= E\big[(\mathbf{W} - \mu_W)(\mathbf{W} - \mu_W)'\big] && (2) \\
&= E\left[\begin{bmatrix} \mathbf{X} - \mu_X \\ \mathbf{Y} - \mu_Y \end{bmatrix}\begin{bmatrix} (\mathbf{X} - \mu_X)' & (\mathbf{Y} - \mu_Y)' \end{bmatrix}\right] && (3) \\
&= \begin{bmatrix} E[(\mathbf{X} - \mu_X)(\mathbf{X} - \mu_X)'] & E[(\mathbf{X} - \mu_X)(\mathbf{Y} - \mu_Y)'] \\ E[(\mathbf{Y} - \mu_Y)(\mathbf{X} - \mu_X)'] & E[(\mathbf{Y} - \mu_Y)(\mathbf{Y} - \mu_Y)'] \end{bmatrix} && (4) \\
&= \begin{bmatrix} C_X & C_{XY} \\ C_{YX} & C_Y \end{bmatrix}. && (5)
\end{aligned}
\]
The assumption that $\mathbf{X}$ and $\mathbf{Y}$ are independent implies that
\[
C_{XY} = E\big[(\mathbf{X} - \mu_X)(\mathbf{Y} - \mu_Y)'\big] = \big(E[\mathbf{X} - \mu_X]\big)\, E\big[(\mathbf{Y} - \mu_Y)'\big] = 0.
\]
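The vanishing off-diagonal block can be seen empirically: stacking samples of independent $\mathbf{X}$ and $\mathbf{Y}$ and estimating $C_W$ gives a $C_{XY}$ block that is statistically zero. The dimensions and scales below are arbitrary illustration values:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200_000, 2)) * [1.0, 2.0]   # independent of Y
Y = rng.standard_normal((200_000, 2)) * [3.0, 0.5]
W = np.hstack([X, Y])                                # W = [X' Y']'
CW = np.cov(W.T)
assert np.all(np.abs(CW[:2, 2:]) < 0.1)              # C_XY block is (statistically) zero
```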
Problem Solutions Chapter 5
Problem 5.1.1 Solution
The repair of each laptop can be viewed as an independent trial with four possible outcomes corresponding to the four types of needed repairs.
(a) Since the four types of repairs are mutually exclusive ch
[Figure: histogram of the simulated jackpot $J$; frequency (0–150) versus $J$ in the range $1.7076 \times 10^7$ to $1.7096 \times 10^7$.]
If you go back and solve Problem 5.5.5, you will see that the jackpot $J$ has expected value $E[J] = (3/2)^7 \times 10^6 = 1.70859 \times 10^7$ dollars. Thus it is not
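The quoted expected value is just arithmetic:

```python
# (3/2)^7 * 10^6 = 2187/128 * 10^6 = 17,085,937.5 dollars, i.e. about 1.70859e7.
EJ = (3 / 2) ** 7 * 1e6
assert abs(EJ - 1.70859e7) < 100
```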
Problem 5.4.2 Solution
The random variables $N_1$, $N_2$, $N_3$ and $N_4$ are dependent. To see this we observe that $P_{N_i}(4) = p_i^4$. However,
\[
P_{N_1,N_2,N_3,N_4}(4,4,4,4) = 0 \ne p_1^4 p_2^4 p_3^4 p_4^4 = P_{N_1}(4)\, P_{N_2}(4)\, P_{N_3}(4)\, P_{N_4}(4). \tag{1}
\]
Problem 5.4.3 Solution
We wi
The off-diagonal zero blocks are a consequence of $[Y_1\ Y_2]'$ being independent of $[Y_3\ Y_4]'$. Along the diagonal, the two identical sub-blocks occur because $f_{Y_1,Y_2}(x,y) = f_{Y_3,Y_4}(x,y)$. In short, the matrix structure is the result of $[Y_1\ Y_2]'$ and $[Y_3\ Y_4]'$ being iid
Problem Solutions Chapter 6
Problem 6.1.1 Solution
The random variable $X_{33}$ is a Bernoulli random variable that indicates the result of flip 33. The PMF of $X_{33}$ is
\[
P_{X_{33}}(x) = \begin{cases} 1 - p & x = 0, \\ p & x = 1, \\ 0 & \text{otherwise}. \end{cases} \tag{1}
\]
Note that each $X_i$ has expected value $E[X] = p$
This implies
\[
E[Y_1] = E[Y_3] = \int_0^1 2y(1 - y)\, dy = 1/3, \tag{3}
\]
\[
E[Y_2] = E[Y_4] = \int_0^1 2y^2\, dy = 2/3. \tag{4}
\]
Thus $\mathbf{Y}$ has expected value $E[\mathbf{Y}] = \begin{bmatrix} 1/3 & 2/3 & 1/3 & 2/3 \end{bmatrix}'$. The second part of the problem is to find the correlation matrix $R_Y$. In fact, we need to find $R_Y(i,j) = E$
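The two moment integrals above can be verified numerically with a midpoint rule:

```python
# Midpoint-rule check of E[Y1] = integral of 2y(1-y) over [0,1] = 1/3
# and E[Y2] = integral of 2y^2 over [0,1] = 2/3.
n = 100_000
h = 1.0 / n
mids = [(k + 0.5) * h for k in range(n)]
EY1 = sum(2 * y * (1 - y) for y in mids) * h
EY2 = sum(2 * y * y for y in mids) * h
assert abs(EY1 - 1 / 3) < 1e-6
assert abs(EY2 - 2 / 3) < 1e-6
```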
Finally, the probability that more laptops require motherboard repairs than keyboard repairs is
\[
P[N_2 > N_3] = P_{N_2,N_3}(1,0) + P_{N_2,N_3}(2,0) + P_{N_2,N_3}(2,1) + P_{N_2}(3) + P_{N_2}(4), \tag{10}
\]
where we use the fact that if N2 = 3 or N2 = 4, then we must have N2 >
function p=sailboats(w,m)
%Usage: p=sailboats(w,m)
%In Problem 5.8.4, W is the
%winning time in a 10 boat race.
%We use m trials to estimate
%P[W<=w]
CX=5*eye(10)+20*ones(10,10);
mu=35*ones(10,1);
X=gaussvector(mu,CX,m);
W=min(X);
p=sum(W<=w)/m;
> sailb
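A Python sketch of the same estimator (the covariance $5I + 20\,\mathbf{1}\mathbf{1}'$ and mean 35 are taken from the MATLAB above; the text's `gaussvector` helper is replaced here by NumPy's multivariate normal sampler):

```python
import numpy as np

def sailboats(w, m, rng):
    """Estimate P[W <= w], where W is the minimum of 10 jointly Gaussian times."""
    CX = 5 * np.eye(10) + 20 * np.ones((10, 10))
    mu = 35 * np.ones(10)
    X = rng.multivariate_normal(mu, CX, size=m)   # each row is one simulated race
    W = X.min(axis=1)
    return np.mean(W <= w)

rng = np.random.default_rng(0)
p = sailboats(35, 10_000, rng)
assert 0.0 < p < 1.0   # the winning time falls below the common mean in most races
```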
Problem 5.4.7 Solution
Since $U_1, \ldots, U_n$ are iid uniform $(0, T)$ random variables,
\[
f_{U_1,\ldots,U_n}(u_1, \ldots, u_n) = \begin{cases} 1/T^n & 0 \le u_i \le T;\ i = 1, 2, \ldots, n, \\ 0 & \text{otherwise}. \end{cases} \tag{1}
\]
Since $U_1, \ldots, U_n$ are continuous, $P[U_i = U_j] = 0$ for all $i \ne j$. For the same re
Problem 5.3.6 Solution
In Example 5.1, random variables $N_1, \ldots, N_r$ have the multinomial distribution
\[
P_{N_1,\ldots,N_r}(n_1, \ldots, n_r) = \binom{n}{n_1, \ldots, n_r} p_1^{n_1} \cdots p_r^{n_r} \tag{1}
\]
where $n > r > 2$.
(a) To evaluate the joint PMF of $N_1$ and $N_2$, we define a new ex
The complete expression for the joint PDF of $Y_1$ and $Y_2$ is
\[
f_{Y_1,Y_2}(y_1, y_2) = \begin{cases} 12(1 - y_2)^2 & 0 \le y_1 \le y_2 \le 1, \\ 0 & \text{otherwise}. \end{cases} \tag{9}
\]
For $0 \le y_1 \le 1$, the marginal PDF of $Y_1$ can be found from
\[
f_{Y_1}(y_1) = \int_{y_1}^{1} f_{Y_1,Y_2}(y_1, y_2)\, dy_2 = \int_{y_1}^{1} 12(1 - y_2)^2\, dy_2 = 4(1 - y_1)^3. \tag{10}
\]
The compl
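The marginal can be double-checked numerically (midpoint rule) against the closed form $4(1 - y_1)^3$:

```python
# Check that the integral of 12(1 - y2)^2 from y1 to 1 equals 4(1 - y1)^3
# at a few sample points.
for y1 in [0.0, 0.25, 0.5, 0.9]:
    n = 100_000
    h = (1.0 - y1) / n
    total = sum(12 * (1 - (y1 + (k + 0.5) * h)) ** 2 for k in range(n)) * h
    assert abs(total - 4 * (1 - y1) ** 3) < 1e-6
```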
function err=poissonsigma(a,k)
xmin=max(0,floor(a-k*sqrt(a)));
xmax=a+ceil(k*sqrt(a));
sx=xmin:xmax;
logfacts=cumsum([0,log(1:xmax)]);
%logfacts includes 0 in case xmin=0
%Now we extract needed values:
logfacts=logfacts(sx+1);
%pmf(i,:) is a Poisson a(i) P
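The cumsum-of-logs idea (building $\log k!$ as a running sum so that large factorials never overflow) can be sketched in Python, with `math.lgamma(k + 1)` $= \log k!$ as an independent check:

```python
import math

xmax = 20
logfacts = [0.0]                                  # log 0! = 0
for k in range(1, xmax + 1):
    logfacts.append(logfacts[-1] + math.log(k))   # log k! = log (k-1)! + log k

for k in range(xmax + 1):
    assert math.isclose(logfacts[k], math.lgamma(k + 1), abs_tol=1e-9)
```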
Given $f_X(\mathbf{x})$ with $c = 2/3$ and $a_1 = a_2 = a_3 = 1$ in Problem 5.2.2, find the marginal PDF $f_{X_3}(x_3)$.
Filling in the parameters in Problem 5.2.2, we obtain the vector PDF
\[
f_X(\mathbf{x}) = \begin{cases} \frac{2}{3}(x_1 + x_2 + x_3) & 0 \le x_1, x_2, x_3 \le 1, \\ 0 & \text{otherwise}. \end{cases} \tag{1}
\]
In this case, for $0 \le x_3$
The above expression may seem unwieldy and it isn't even clear that it will sum to 1. To simplify the expression, we observe that
\[
P_{X,Y}(x, y) = P_{X,Y,Z}(x, y, 5 - x - y) = P_{X,Y|Z}(x, y \,|\, 5 - x - y)\, P_Z(5 - x - y). \tag{7}
\]
Using $P_Z(z)$ found in part (c), we can calculate $P_{X}$
function x=bigpoissonrv(alpha)
%For vector alpha, returns a vector x such that
% x(i) is a Poisson (alpha(i)) rv
%Set up Poisson CDF from xmin to xmax for each alpha(i)
alpha=alpha(:);
amin=min(alpha(:));
amax=max(alpha(:));
%Assume Poisson PMF is negligible +-
Problem 5.6.1 Solution
(a) The covariance matrix of $\mathbf{X} = \begin{bmatrix} X_1 & X_2 \end{bmatrix}'$ is
\[
C_X = \begin{bmatrix} \operatorname{Var}[X_1] & \operatorname{Cov}[X_1, X_2] \\ \operatorname{Cov}[X_1, X_2] & \operatorname{Var}[X_2] \end{bmatrix} = \begin{bmatrix} 4 & 3 \\ 3 & 9 \end{bmatrix}. \tag{1}
\]
(b) From the problem statement,
\[
\mathbf{Y} = \begin{bmatrix} Y_1 \\ Y_2 \end{bmatrix} = \begin{bmatrix} 1 & 2 \\ 3 & 4 \end{bmatrix}\mathbf{X} = A\mathbf{X}. \tag{2}
\]
By Theorem 5.13, $\mathbf{Y}$ has covariance matrix $C_Y = A C_X A'$
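Carrying the matrix product through numerically with $A$ and $C_X$ as printed (the resulting $C_Y$ is my computation, not quoted from the text):

```python
import numpy as np

# C_Y = A C_X A' with the numbers from this solution.
A = np.array([[1, 2], [3, 4]])
CX = np.array([[4, 3], [3, 9]])
CY = A @ CX @ A.T
assert np.array_equal(CY, np.array([[52, 114], [114, 252]]))
```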