Statistics 265
Elements of Probability Theory
Fall Term 2000
Assignment 5
Due: Tuesday December 5, 2000
Solutions

5.94 A target for a bomb is in the center of a circle with radius 1 mile. A bomb falls at a randomly selected
point inside that circle. If the bomb destroys everything within $\frac{1}{2}$ mile of its landing point, what is the
probability that the target is destroyed?
Solution: Assume that the circle has its center at the origin and has radius 1; then a point $(y_1, y_2)$ in
the plane is inside the circle if and only if its coordinates satisfy the inequality $y_1^2 + y_2^2 \le 1$.
Let $(Y_1, Y_2)$ be the point in the plane where the bomb lands, so $Y_1$ and $Y_2$ are random variables. We
want to calculate the probability that the target is destroyed. This is the same as the probability that
the landing point $(Y_1, Y_2)$ is within a distance of $\frac{1}{2}$ of the target. Therefore, we want the
probability
$$P\left(Y_1^2 + Y_2^2 \le \tfrac{1}{4}\right).$$
Graphically,
$$P(\text{bomb falls within shaded circle}) = \frac{\text{area of shaded circle}}{\text{area of outside circle}} = \frac{\pi\left(\tfrac{1}{2}\right)^2}{\pi (1)^2} = \frac{1}{4}.$$
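As a quick sanity check (not part of the original solution), the geometric answer can be confirmed by simulation; the seed and trial count in this Python sketch are arbitrary choices:

```python
import random

# Monte Carlo check of P(target destroyed) = 1/4.
random.seed(0)
trials = 200_000
hits = 0
for _ in range(trials):
    # Rejection-sample a uniform point in the unit disk.
    while True:
        y1, y2 = random.uniform(-1, 1), random.uniform(-1, 1)
        if y1 * y1 + y2 * y2 <= 1:
            break
    # The target at the origin is destroyed if the bomb lands within 1/2 mile.
    if y1 * y1 + y2 * y2 <= 0.25:
        hits += 1
estimate = hits / trials
print(estimate)  # close to 0.25
```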
5.95 Two friends are to meet at the library. Each independently and randomly selects an arrival time within
the same one-hour period. Each agrees to wait a maximum of 10 minutes for the other to arrive. What is
the probability that they will meet?
Solution: Let $Y_1$ = arrival time for the first friend, $0 \le y_1 \le 1$, and $Y_2$ = arrival time for the
second friend, $0 \le y_2 \le 1$. We assume that $Y_1$ and $Y_2$ are jointly uniformly distributed over the
square $[0,1] \times [0,1]$, so the joint density function of $Y_1$ and $Y_2$ is given by
$$f(y_1, y_2) = \begin{cases} 1 & 0 \le y_1 \le 1,\ 0 \le y_2 \le 1 \\ 0 & \text{otherwise.} \end{cases}$$
If the second friend arrives 10 minutes ($\frac{1}{6}$th of an hour) before or 10 minutes ($\frac{1}{6}$th of an hour) after the
first friend arrives, then they will meet; this is the event $\left(|Y_1 - Y_2| < \frac{1}{6}\right)$. Therefore, the probability
that they meet is given by
$$P\left(|Y_1 - Y_2| < \tfrac{1}{6}\right) = \iint_E f(y_1, y_2)\,dy_1\,dy_2$$
where the region of integration $E$ in the $y_1 y_2$ plane is the region $\left\{(y_1, y_2) \mid |y_1 - y_2| < \tfrac{1}{6}\right\}$, which is shaded
in the figure below.
[Figure: the unit square in the $y_1 y_2$ plane with the band between the lines $y_2 = y_1 + 1/6$ and $y_2 = y_1 - 1/6$ shaded.]
Since the density function is nonzero only on the square $[0,1] \times [0,1]$, we have
$$P\left(|Y_1 - Y_2| < \tfrac{1}{6}\right)
= \int_0^{1/6}\!\!\int_0^{y_1 + 1/6} dy_2\,dy_1
+ \int_{1/6}^{5/6}\!\!\int_{y_1 - 1/6}^{y_1 + 1/6} dy_2\,dy_1
+ \int_{5/6}^{1}\!\!\int_{y_1 - 1/6}^{1} dy_2\,dy_1$$
$$= \int_0^{1/6}\left(y_1 + \tfrac{1}{6}\right)dy_1 + \int_{1/6}^{5/6}\tfrac{1}{3}\,dy_1 + \int_{5/6}^{1}\left(\tfrac{7}{6} - y_1\right)dy_1$$
$$= \left[\frac{y_1^2}{2} + \frac{y_1}{6}\right]_0^{1/6} + \left[\frac{y_1}{3}\right]_{1/6}^{5/6} + \left[\frac{7y_1}{6} - \frac{y_1^2}{2}\right]_{5/6}^{1}
= \frac{22}{72} = \frac{11}{36}.$$
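The answer can be double-checked two ways, exactly via the complement (the non-meeting region is two corner triangles) and by simulation; this Python sketch is an independent check, not part of the original solution, with an arbitrary seed and trial count:

```python
import random
from fractions import Fraction

# Exact value via the complement: the non-meeting region consists of two
# right triangles with legs 5/6, so P(meet) = 1 - (5/6)**2 = 11/36.
exact = 1 - Fraction(5, 6) ** 2
assert exact == Fraction(11, 36)

# Monte Carlo confirmation.
random.seed(1)
trials = 200_000
meets = sum(abs(random.random() - random.random()) < 1 / 6 for _ in range(trials))
estimate = meets / trials
print(estimate)  # close to 11/36 ≈ 0.3056
```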
5.96 A committee of three people is to be randomly selected from a group containing four Republicans, three
Democrats, and two independents. Let Y1 and Y2 denote the numbers of Republicans and Democrats,
respectively, on the committee.
(a) What is the joint probability distribution for Y1 and Y2?
(b) Find the marginal distributions of $Y_1$ and $Y_2$.
(c) Find $P(Y_1 = 1 \mid Y_2 \ge 1)$.
Solution: In this exercise Y1 and Y2 are both discrete random variables.
(a) The joint probability mass function for $Y_1$ and $Y_2$ is given by
$$f(y_1, y_2) = P\left((Y_1 = y_1) \cap (Y_2 = y_2)\right) = \frac{\binom{4}{y_1}\binom{3}{y_2}\binom{2}{3 - y_1 - y_2}}{\binom{9}{3}}$$
for $y_1 = 0, 1, 2, 3$, $y_2 = 0, 1, 2, 3$, $0 \le y_1 + y_2 \le 3$.
(b) The marginal probability function for $Y_1$ is given by
$$f_1(y_1) = \sum_{y_2} f(y_1, y_2) = \sum_{y_2} \frac{\binom{4}{y_1}\binom{3}{y_2}\binom{2}{3 - y_1 - y_2}}{\binom{9}{3}}
= \frac{\binom{4}{y_1}}{\binom{9}{3}} \sum_{y_2} \binom{3}{y_2}\binom{2}{3 - y_1 - y_2}.$$
Recalling the identity
$$\sum_{k} \binom{a}{k}\binom{b}{r - k} = \binom{a + b}{r}$$
for nonnegative integers $a$, $b$, and $r$, we have
$$f_1(y_1) = \frac{\binom{4}{y_1}\binom{5}{3 - y_1}}{\binom{9}{3}} \quad \text{for } y_1 = 0, 1, 2, 3.$$
The marginal probability function for $Y_2$ is given by
$$f_2(y_2) = \sum_{y_1} f(y_1, y_2) = \sum_{y_1} \frac{\binom{4}{y_1}\binom{3}{y_2}\binom{2}{3 - y_1 - y_2}}{\binom{9}{3}}
= \frac{\binom{3}{y_2}}{\binom{9}{3}} \sum_{y_1} \binom{4}{y_1}\binom{2}{3 - y_1 - y_2}.$$
Using the same identity as before, we have
$$f_2(y_2) = \frac{\binom{3}{y_2}\binom{6}{3 - y_2}}{\binom{9}{3}} \quad \text{for } y_2 = 0, 1, 2, 3.$$
(c) The conditional probability is given by
$$P(Y_1 = 1 \mid Y_2 \ge 1) = \frac{P\left((Y_1 = 1) \cap (Y_2 \ge 1)\right)}{P(Y_2 \ge 1)} = \frac{f(1,1) + f(1,2)}{1 - P(Y_2 = 0)}.$$
Now,
$$f(1,1) + f(1,2) = \frac{\binom{4}{1}\binom{3}{1}\binom{2}{1} + \binom{4}{1}\binom{3}{2}\binom{2}{0}}{\binom{9}{3}} = \frac{24 + 12}{84} = \frac{36}{84}$$
and
$$1 - P(Y_2 = 0) = 1 - f_2(0) = 1 - \frac{\binom{6}{3}}{\binom{9}{3}} = 1 - \frac{20}{84} = \frac{64}{84}$$
so that
$$P(Y_1 = 1 \mid Y_2 \ge 1) = \frac{36}{64} = \frac{9}{16}.$$
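Since the sample space here is small, the conditional probability can be verified by brute-force enumeration of all $\binom{9}{3} = 84$ committees; this Python sketch is an independent check, not part of the original solution:

```python
from fractions import Fraction
from itertools import combinations

# Label the group: R = Republican, D = Democrat, I = independent.
group = ["R"] * 4 + ["D"] * 3 + ["I"] * 2
committees = list(combinations(range(9), 3))  # C(9,3) = 84 equally likely committees

def prob(event):
    return Fraction(sum(1 for c in committees if event(c)), len(committees))

# P(Y1 = 1 | Y2 >= 1) = P(Y1 = 1 and Y2 >= 1) / P(Y2 >= 1)
num = prob(lambda c: sum(group[i] == "R" for i in c) == 1
                     and sum(group[i] == "D" for i in c) >= 1)
den = prob(lambda c: sum(group[i] == "D" for i in c) >= 1)
answer = num / den
print(answer)  # 9/16
```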
5.97 Let $Y_1$ and $Y_2$ have a joint density function given by
$$f(y_1, y_2) = \begin{cases} 3y_1 & 0 \le y_2 \le y_1 \le 1 \\ 0 & \text{elsewhere.} \end{cases}$$
(a) Find the marginal density functions of $Y_1$ and $Y_2$.
(b) Find $P\left(Y_1 \le \tfrac{3}{4} \mid Y_2 \le \tfrac{1}{2}\right)$.
(c) Find the conditional density function of $Y_1$ given $Y_2 = y_2$.
(d) Find $P\left(Y_1 \le \tfrac{3}{4} \mid Y_2 = \tfrac{1}{2}\right)$.

Solution: (a) The marginal density function of $Y_1$ is given by
$$f_1(y_1) = \int_{-\infty}^{\infty} f(y_1, y_2)\,dy_2.$$
If $0 \le y_1 \le 1$, then
$$f_1(y_1) = \int_0^{y_1} 3y_1\,dy_2 = 3y_1 (y_2)\Big|_{y_2 = 0}^{y_2 = y_1} = 3y_1^2,$$
and
$$f_1(y_1) = \int_{-\infty}^{\infty} 0\,dy_2 = 0 \quad \text{if } y_1 < 0 \text{ or } y_1 > 1.$$
The marginal density function of $Y_2$ is given by
$$f_2(y_2) = \int_{-\infty}^{\infty} f(y_1, y_2)\,dy_1.$$
If $0 \le y_2 \le 1$, then
$$f_2(y_2) = \int_{y_2}^{1} 3y_1\,dy_1 = \frac{3y_1^2}{2}\Big|_{y_1 = y_2}^{y_1 = 1} = \frac{3}{2}\left(1 - y_2^2\right),$$
and
$$f_2(y_2) = \int_{-\infty}^{\infty} 0\,dy_1 = 0 \quad \text{if } y_2 < 0 \text{ or } y_2 > 1.$$
(b) From the definition of the conditional probability, we have
$$P\left(Y_1 \le \tfrac{3}{4} \mid Y_2 \le \tfrac{1}{2}\right) = \frac{P\left(\left(Y_1 \le \tfrac{3}{4}\right) \cap \left(Y_2 \le \tfrac{1}{2}\right)\right)}{P\left(Y_2 \le \tfrac{1}{2}\right)}$$
and
$$P\left(\left(Y_1 \le \tfrac{3}{4}\right) \cap \left(Y_2 \le \tfrac{1}{2}\right)\right)
= \int_0^{1/2}\!\!\int_0^{y_1} 3y_1\,dy_2\,dy_1 + \int_{1/2}^{3/4}\!\!\int_0^{1/2} 3y_1\,dy_2\,dy_1
= \frac{1}{8} + \frac{15}{64} = \frac{23}{64},$$
while
$$P\left(Y_2 \le \tfrac{1}{2}\right) = \int_0^{1/2} \frac{3}{2}\left(1 - y_2^2\right)dy_2 = \frac{11}{16},$$
and therefore
$$P\left(Y_1 \le \tfrac{3}{4} \mid Y_2 \le \tfrac{1}{2}\right) = \frac{23/64}{11/16} = \frac{23}{44}.$$
(c) The conditional density function of $Y_1$ given $Y_2 = y_2$ is given by
$$f(y_1 \mid y_2) = \begin{cases} \dfrac{f(y_1, y_2)}{f_2(y_2)} & \text{if } f_2(y_2) > 0 \\ 0 & \text{otherwise,} \end{cases}$$
and $f_2(y_2) > 0$ if and only if $0 \le y_2 < 1$; in this case
$$f(y_1 \mid y_2) = \frac{3y_1}{\frac{3}{2}\left(1 - y_2^2\right)} = \frac{2y_1}{1 - y_2^2} \quad \text{for } 0 \le y_2 \le y_1 \le 1.$$
(d) Using the conditional density function of $Y_1$ given $Y_2 = \frac{1}{2}$, we have
$$P\left(Y_1 \le \tfrac{3}{4} \mid Y_2 = \tfrac{1}{2}\right) = \int_{1/2}^{3/4} f\left(y_1 \mid y_2 = \tfrac{1}{2}\right)dy_1
= \int_{1/2}^{3/4} \frac{2y_1}{1 - \frac{1}{4}}\,dy_1 = \frac{8y_1^2}{6}\Big|_{1/2}^{3/4} = \frac{5}{12}.$$
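The numbers in parts (b) and (d) can be verified independently; in this Python sketch (not part of the original solution, with an arbitrarily chosen grid size) part (b)'s integrals are approximated by midpoint Riemann sums and part (d) is evaluated exactly:

```python
from fractions import Fraction

# Midpoint Riemann sums over the density f(y1, y2) = 3*y1 on 0 <= y2 <= y1 <= 1;
# the grid size n is an arbitrary accuracy choice.
n = 400
h = 1.0 / n

def f(y1, y2):
    return 3 * y1 if y2 <= y1 else 0.0

num = sum(f((i + 0.5) * h, (j + 0.5) * h)          # y1 <= 3/4 and y2 <= 1/2
          for i in range(3 * n // 4) for j in range(n // 2)) * h * h
den = sum(f((i + 0.5) * h, (j + 0.5) * h)          # y2 <= 1/2
          for i in range(n) for j in range(n // 2)) * h * h
print(num, den, num / den)  # ≈ 23/64, 11/16, 23/44

# Part (d) exactly: integrate f(y1 | 1/2) = 8*y1/3 from 1/2 to 3/4,
# i.e. (4/3)*y1**2 evaluated between the limits.
part_d = Fraction(4, 3) * (Fraction(3, 4) ** 2 - Fraction(1, 2) ** 2)
assert part_d == Fraction(5, 12)
```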
5.98 Refer to Exercise 5.97.
(a) Find $E(Y_2 \mid Y_1 = y_1)$.
(b) Use Theorem 5.14 to find $E(Y_2)$.
(c) Find $E(Y_2)$ directly from the marginal density of $Y_2$.
Solution: (a) The conditional expectation of $Y_2$ given $Y_1 = y_1$ is given by
$$E(Y_2 \mid Y_1 = y_1) = \int_{-\infty}^{\infty} y_2\, f(y_2 \mid y_1)\,dy_2.$$
Now, $f_1(y_1) > 0$ if and only if $0 < y_1 \le 1$, and in this case
$$f(y_2 \mid y_1) = \frac{f(y_1, y_2)}{f_1(y_1)} = \frac{3y_1}{3y_1^2} = \frac{1}{y_1}$$
so that
$$E(Y_2 \mid Y_1 = y_1) = \int_0^{y_1} \frac{y_2}{y_1}\,dy_2 = \frac{y_2^2}{2y_1}\Big|_0^{y_1} = \frac{y_1}{2}.$$
(b) Using Theorem 5.14, we have
$$E(Y_2) = E\left(E(Y_2 \mid Y_1)\right) = E\left(\frac{Y_1}{2}\right) = \int_0^1 \frac{y_1}{2}\,3y_1^2\,dy_1 = \frac{3y_1^4}{8}\Big|_0^1 = \frac{3}{8}.$$
(c) Using the marginal density of $Y_2$, we have
$$E(Y_2) = \int_0^1 y_2\,\frac{3}{2}\left(1 - y_2^2\right)dy_2 = \int_0^1 \left(\frac{3y_2}{2} - \frac{3y_2^3}{2}\right)dy_2
= \left[\frac{3y_2^2}{4} - \frac{3y_2^4}{8}\right]_0^1 = \frac{3}{4} - \frac{3}{8} = \frac{3}{8}.$$
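Both routes to $E(Y_2)$ reduce to integrals of polynomials, so they can be checked with exact rational arithmetic; this Python sketch is an independent check, not part of the original solution:

```python
from fractions import Fraction

# E(Y2) via the iterated expectation E(Y1/2), using f1(y1) = 3*y1**2:
# the integral of (y1/2)*3*y1**2 over [0,1] is (3/2)*(1/4).
via_theorem = Fraction(3, 2) * Fraction(1, 4)

# E(Y2) directly from f2(y2) = (3/2)*(1 - y2**2):
# the integral of y2*(3/2)*(1 - y2**2) over [0,1] is (3/2)*(1/2 - 1/4).
direct = Fraction(3, 2) * (Fraction(1, 2) - Fraction(1, 4))

print(via_theorem, direct)  # both 3/8
```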
5.100 In the production of a certain type of copper, two types of copper powder (types A and B) are mixed
together and sintered (heated) for a certain length of time. For a fixed volume of sintered copper, the
producer measures the proportion $Y_1$ of the volume due to solid copper (some pores will have to be
filled with air) and the proportion $Y_2$ of the solid mass due to type A crystals. Assume that appropriate
probability densities for $Y_1$ and $Y_2$ are
$$f_1(y_1) = \begin{cases} 6y_1(1 - y_1) & 0 \le y_1 \le 1 \\ 0 & \text{elsewhere} \end{cases}
\qquad
f_2(y_2) = \begin{cases} 3y_2^2 & 0 \le y_2 \le 1 \\ 0 & \text{elsewhere.} \end{cases}$$
The proportion of the sample volume due to type A crystals is then $Y_1 Y_2$. Assuming that $Y_1$ and $Y_2$ are
independent, find $P(Y_1 Y_2 \le 0.5)$.
Solution: The region in which $y_1 y_2 \le 0.5$ is shaded in the figure below; it is easier to work with the
unshaded portion, that is, to calculate $P(Y_1 Y_2 > 0.5)$ and obtain
$P(Y_1 Y_2 \le 0.5) = 1 - P(Y_1 Y_2 > 0.5)$.

[Figure: the unit square in the $y_1 y_2$ plane with the region below the curve $y_1 y_2 = 0.5$ shaded.]

$$P(Y_1 Y_2 > 0.5) = 18 \int_{1/2}^{1}\!\!\int_{1/(2y_2)}^{1} \left(y_1 - y_1^2\right) y_2^2\,dy_1\,dy_2
= 18 \int_{1/2}^{1} \left(\frac{1}{6} - \frac{1}{8y_2^2} + \frac{1}{24y_2^3}\right) y_2^2\,dy_2$$
$$= 18 \int_{1/2}^{1} \left(\frac{y_2^2}{6} - \frac{1}{8} + \frac{1}{24 y_2}\right) dy_2
= \left[y_2^3 - \frac{9y_2}{4} + \frac{3}{4}\log y_2\right]_{1/2}^{1}
= -\frac{1}{4} + \frac{3}{4}\log 2$$
so that
$$P(Y_1 Y_2 \le 0.5) = 1 + \frac{1}{4} - \frac{3}{4}\log 2 = \frac{5}{4} - \frac{3}{4}\log 2;$$
here $\log 2$ is the natural logarithm of 2.
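A simulation cross-check (not part of the original solution): $Y_1$ has a Beta(2,2) density and $Y_2$ has a Beta(3,1) density, so both are easy to sample; the seed and trial count are arbitrary choices:

```python
import math
import random

# Y1 ~ Beta(2,2) has density 6*y*(1-y); Y2 with density 3*y**2 is Beta(3,1),
# which can be sampled as U**(1/3) by inverting its CDF y**3.
random.seed(2)
trials = 400_000
count = 0
for _ in range(trials):
    y1 = random.betavariate(2, 2)
    y2 = random.random() ** (1 / 3)
    if y1 * y2 <= 0.5:
        count += 1
estimate = count / trials
exact = 5 / 4 - 0.75 * math.log(2)  # 5/4 - (3/4) ln 2 ≈ 0.7301
print(estimate, exact)
```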
5.101 Suppose that the number of eggs laid by a certain insect has a Poisson distribution with mean $\lambda$. The
probability that any one egg hatches is $p$. Assume that the eggs hatch independently of one another.
(a) Find the expected value of $Y$, the total number of eggs that hatch.
(b) Find the variance of $Y$.
Solution: (a) Let $X$ be the number of eggs laid by an insect, and let $Y$ be the number of eggs hatched. Given that
$X = x$ eggs were laid, $Y$ has a binomial distribution with $p = P(\text{egg hatches})$. Therefore,
$$E(Y \mid X = x) = px.$$
Since $X$ has a Poisson distribution with parameter $\lambda$, then
$$E(Y) = E(pX) = pE(X) = p\lambda.$$
(b) We have
$$\operatorname{var}(Y \mid X = x) = xp(1 - p)$$
and from part (a), $E(Y \mid X = x) = px$; therefore
$$\operatorname{var}(Y) = E\left(\operatorname{var}(Y \mid X)\right) + \operatorname{var}\left(E(Y \mid X)\right) = E\left(Xp(1 - p)\right) + \operatorname{var}(pX).$$
Since $E(X) = \lambda$ and $\operatorname{var}(X) = \lambda$, then
$$\operatorname{var}(Y) = p(1 - p)E(X) + p^2\operatorname{var}(X) = p\lambda - p^2\lambda + p^2\lambda = p\lambda.$$
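The identities $E(Y) = \operatorname{var}(Y) = p\lambda$ can be checked by simulating the two-stage experiment; the parameter values $\lambda = 4$ and $p = 0.3$ in this Python sketch are hypothetical choices made only for the check:

```python
import math
import random

lam, p = 4.0, 0.3  # hypothetical parameter values for this check only
random.seed(3)
trials = 200_000
samples = []
L = math.exp(-lam)
for _ in range(trials):
    # Sample the number of eggs laid: Poisson(lam) via Knuth's
    # multiplication method (fine for small lam).
    k, prod = 0, random.random()
    while prod > L:
        k += 1
        prod *= random.random()
    # Each of the k eggs hatches independently with probability p.
    samples.append(sum(random.random() < p for _ in range(k)))

mean = sum(samples) / trials
var = sum((s - mean) ** 2 for s in samples) / trials
print(mean, var)  # both should be close to p*lam = 1.2
```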
5.102 In a clinical study of a new drug formulated to reduce the effects of rheumatoid arthritis, researchers found
that the proportion $p$ of patients who respond favorably to the drug is a random variable that varies from
batch to batch of the drug. Assume that $p$ has a probability density function given by
$$f(p) = \begin{cases} 12p^2(1 - p) & 0 \le p \le 1 \\ 0 & \text{elsewhere.} \end{cases}$$
Suppose that $n$ patients are injected with portions of the drug taken from the same batch. Let $Y$ denote
the number showing a favorable response.
(a) Find the unconditional probability distribution of $Y$ for general $n$.
(b) Find $E(Y)$ for $n = 2$.
Solution: The conditional distribution of $Y$ given $p$ is a binomial distribution with parameters $n$ and $p$, and
the marginal distribution of $p$ is given.
(a) The unconditional probability mass function for $Y$ is given by
$$f_Y(y) = \int_0^1 f(y, p)\,dp = \int_0^1 f(y \mid p)\,f(p)\,dp = \int_0^1 \binom{n}{y} p^y (1 - p)^{n - y}\, 12p^2(1 - p)\,dp$$
for $y = 0, 1, 2, \ldots, n$; that is,
$$f_Y(y) = 12\binom{n}{y} \int_0^1 p^{2 + y}(1 - p)^{n - y + 1}\,dp$$
for $y = 0, 1, 2, \ldots, n$. The integrand is the kernel of the density function of a beta random variable with
parameters $\alpha = 3 + y$ and $\beta = n - y + 2$, so we may evaluate the integral without integrating to get
$$f_Y(y) = 12\binom{n}{y} \frac{\Gamma(n - y + 2)\,\Gamma(3 + y)}{\Gamma(n + 5)}$$
for $y = 0, 1, 2, \ldots, n$.
(b) For $n = 2$ we can use Theorem 5.14 to find the expected value of $Y$:
$$E(Y) = E(2p) = 2E(p) = 2 \cdot 12\int_0^1 p^3(1 - p)\,dp = 24\,\frac{\Gamma(4)\Gamma(2)}{\Gamma(6)} = \frac{24}{20} = \frac{6}{5}.$$
We could also find this directly from the marginal probability mass function for $Y$:
$$E(Y) = \sum_{y=0}^{2} y\,f_Y(y) = \sum_{y=0}^{2} 12y\binom{2}{y}\frac{(3 - y)!\,(y + 2)!}{6!}
= \frac{12}{720}\left[1 \cdot 2 \cdot 2! \cdot 3! + 2 \cdot 1 \cdot 1! \cdot 4!\right] = \frac{12 \cdot 72}{720} = \frac{6}{5}.$$
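The pmf from part (a) can be checked for $n = 2$ with exact rational arithmetic (it should sum to 1 and give $E(Y) = 6/5$); this Python sketch is an independent check, not part of the original solution:

```python
from fractions import Fraction
from math import comb, factorial

def f_y(y, n):
    # Unconditional pmf: 12*C(n,y)*Gamma(n-y+2)*Gamma(3+y)/Gamma(n+5),
    # with Gamma(k) = (k-1)! for integer k.
    return Fraction(12 * comb(n, y) * factorial(n - y + 1) * factorial(y + 2),
                    factorial(n + 4))

n = 2
pmf = [f_y(y, n) for y in range(n + 1)]
assert sum(pmf) == 1  # a genuine probability mass function
ey = sum(y * pmf[y] for y in range(n + 1))
print(pmf, ey)  # E(Y) = 6/5
```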
5.104 A coin has probability $p$ of coming up heads when tossed. In $n$ independent tosses of the coin, let $X_i = 1$
if the $i$th toss results in heads and $X_i = 0$ if the $i$th toss results in tails. Then $Y$, the number of heads in
the $n$ tosses, has a binomial distribution and can be represented as
$$Y = \sum_{i=1}^{n} X_i.$$
Find $E(Y)$ and $\operatorname{var}(Y)$ using Theorem 5.12.
Solution: The probability distribution of $X_i$ is given by
$$p(x_i) = \begin{cases} p & \text{if } x_i = 1 \\ q & \text{if } x_i = 0 \end{cases}$$
so the expected value and variance of $X_i$ are given by
$$E(X_i) = 1 \cdot p + 0 \cdot q = p, \qquad \operatorname{var}(X_i) = 1^2 \cdot p + 0^2 \cdot q - p^2 = p - p^2 = pq.$$
Since $X_i$ and $X_j$ are independent if $i \ne j$, then $\operatorname{cov}(X_i, X_j) = 0$ for all pairs $(X_i, X_j)$ with $i \ne j$, and we
have
$$E(Y) = E\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} E(X_i) = np, \qquad
\operatorname{var}(Y) = \operatorname{var}\left(\sum_{i=1}^{n} X_i\right) = \sum_{i=1}^{n} \operatorname{var}(X_i) = npq,$$
as it should be.
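Theorem 5.12's conclusion can be confirmed by brute-force enumeration of all $2^n$ toss sequences; the values $n = 4$ and $p = 2/5$ in this Python sketch are hypothetical choices made only for the check:

```python
from fractions import Fraction
from itertools import product

p = Fraction(2, 5)  # hypothetical value of p for this check only
q = 1 - p
n = 4

# Enumerate all 2**n outcomes of the n independent tosses and accumulate
# E(Y) and E(Y^2) exactly.
ey = Fraction(0)
ey2 = Fraction(0)
for tosses in product([0, 1], repeat=n):
    prob = Fraction(1)
    for x in tosses:
        prob *= p if x == 1 else q
    y = sum(tosses)
    ey += y * prob
    ey2 += y * y * prob

var = ey2 - ey ** 2
print(ey, var)  # n*p = 8/5 and n*p*q = 24/25
```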
5.106 A box contains four balls, numbered 1 through 4. One ball is selected at random from this box.
Let X1 = 1 if ball number 1 or ball number 2 is drawn
X2 = 1 if ball number 1 or ball number 3 is drawn
X3 = 1 if ball number 1 or ball number 4 is drawn
and the $X_i$ values are zero otherwise. Show that any two of the random variables $X_1$, $X_2$, and $X_3$ are
independent, but that the three together are not.
Solution: The marginal probabilities are given by
$$P(X_1 = 0) = \tfrac{1}{2} \qquad P(X_2 = 0) = \tfrac{1}{2} \qquad P(X_3 = 0) = \tfrac{1}{2}$$
$$P(X_1 = 1) = \tfrac{1}{2} \qquad P(X_2 = 1) = \tfrac{1}{2} \qquad P(X_3 = 1) = \tfrac{1}{2}$$
while the joint probabilities are given by
$$P(X_1 = 1, X_2 = 1) = P(\text{ball 1}) = \tfrac{1}{4} \qquad P(X_1 = 1, X_2 = 0) = P(\text{ball 2}) = \tfrac{1}{4}$$
$$P(X_1 = 0, X_2 = 1) = P(\text{ball 3}) = \tfrac{1}{4} \qquad P(X_1 = 0, X_2 = 0) = P(\text{ball 4}) = \tfrac{1}{4}$$
$$P(X_1 = 1, X_3 = 1) = P(\text{ball 1}) = \tfrac{1}{4} \qquad P(X_1 = 1, X_3 = 0) = P(\text{ball 2}) = \tfrac{1}{4}$$
$$P(X_1 = 0, X_3 = 1) = P(\text{ball 4}) = \tfrac{1}{4} \qquad P(X_1 = 0, X_3 = 0) = P(\text{ball 3}) = \tfrac{1}{4}$$
$$P(X_2 = 1, X_3 = 1) = P(\text{ball 1}) = \tfrac{1}{4} \qquad P(X_2 = 1, X_3 = 0) = P(\text{ball 3}) = \tfrac{1}{4}$$
$$P(X_2 = 0, X_3 = 1) = P(\text{ball 4}) = \tfrac{1}{4} \qquad P(X_2 = 0, X_3 = 0) = P(\text{ball 2}) = \tfrac{1}{4}.$$
Note that $X_i$ and $X_j$ are pairwise independent since
$$P(X_i = x_i, X_j = x_j) = \frac{1}{4} = \frac{1}{2} \cdot \frac{1}{2} = P(X_i = x_i)\,P(X_j = x_j)$$
for all $i \ne j$. However,
$$P(X_1 = 1, X_2 = 1, X_3 = 1) = P(\text{ball 1}) = \frac{1}{4} \ne \frac{1}{8} = P(X_1 = 1)\,P(X_2 = 1)\,P(X_3 = 1)$$
so that $X_1$, $X_2$, and $X_3$ are not mutually independent.
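The pairwise-but-not-mutual independence can be verified mechanically by enumerating the four equally likely draws; this Python sketch is an independent check, not part of the original solution:

```python
from fractions import Fraction
from itertools import product

# One ball drawn uniformly from balls 1..4; map each ball to (X1, X2, X3).
def x_values(ball):
    return (1 if ball in (1, 2) else 0,
            1 if ball in (1, 3) else 0,
            1 if ball in (1, 4) else 0)

def prob(event):
    return Fraction(sum(1 for b in (1, 2, 3, 4) if event(x_values(b))), 4)

# Every pair factorizes (pairwise independence) ...
for i, j in [(0, 1), (0, 2), (1, 2)]:
    for a, b in product([0, 1], repeat=2):
        joint = prob(lambda x: x[i] == a and x[j] == b)
        marg = prob(lambda x: x[i] == a) * prob(lambda x: x[j] == b)
        assert joint == marg == Fraction(1, 4)

# ... but the triple does not factorize (no mutual independence).
triple = prob(lambda x: x == (1, 1, 1))
print(triple)  # 1/4, not (1/2)**3 = 1/8
```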