Example class 5
STAT1301 Probability and Statistics I
Chan Chi Ho, Chan Tsz Hin & Shi Yun
October 12, 2011

Review: Some common discrete distributions
Table of Important Discrete Distributions

Name | pmf | Description
Bernoulli | P(X = 1) = p, P(X = 0) = 1 − p | One trial; 1 if success, 0 if failure
Binomial b(n, p) | P(X = k) = C(n, k) p^k (1 − p)^(n−k), k = 0, 1, …, n | Number of successes in n Bernoulli trials
Geometric | P(X = k) = (1 − p)^(k−1) p, k = 1, 2, … | Number of Bernoulli trials needed to get one success
Pascal (Negative Binomial) | P(X = k) = C(k − 1, r − 1) p^r (1 − p)^(k−r), k = r, r + 1, … | Number of Bernoulli trials needed to get r successes
Poisson | P(X = k) = e^(−λ) λ^k / k!, k = 0, 1, … | Approximates binomial with λ = np for n large but p not large
Hypergeometric | P(X = k) = C(m, k) C(N − m, n − k) / C(N, n) | Number of red balls chosen when n balls are chosen from N balls, m of which are red
Uniform | P(X = k) = 1/n, k = 1, …, n | Choose one of the numbers 1, …, n

Review: Probability Density Function (PDF)

Definition of PDF
f_X(x) = dF_X(x)/dx
       = lim_(dx→0) [P(X ≤ x + dx) − P(X ≤ x)] / dx
       = lim_(dx→0) [P(X < x + dx) − P(X ≤ x)] / dx
       = lim_(dx→0) [P(X ≤ x + dx) − P(X < x)] / dx
       = lim_(dx→0) [P(X < x + dx) − P(X < x)] / dx

Review: Probability Density Function (PDF)

Properties of PDF
1. f_X(x) ≥ 0
2. ∫_{−∞}^{+∞} f_X(x) dx = 1
3. F_X(x) = ∫_{−∞}^{x} f_X(t) dt
4. P(a < X ≤ b) = P(a ≤ X ≤ b) = P(a ≤ X < b) = P(a < X < b) = ∫_{a}^{b} f_X(x) dx = F_X(b) − F_X(a)

Review: Markov's inequality

Markov's inequality
If X is a nonnegative random variable with finite mean E(X), then for any c > 0,
P(X ≥ c) ≤ E(X)/c.

Proof.
Since X ≥ 0,
cI{X ≥ c} ≤ X.
The inequality follows by taking expectations on both sides.

Review: Chebyshev's inequality
Chebyshev's inequality
If the random variable X has finite mean µ and finite variance σ², then for any real number k > 0,
P(|X − µ| ≥ kσ) ≤ 1/k².

Proof.
By Markov's inequality,
P((X − µ)² ≥ k²σ²) ≤ E((X − µ)²)/(k²σ²) = σ²/(k²σ²) = 1/k².
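Both bounds can be compared against exact tail probabilities. A short Python sketch using an Exponential(1) random variable, an illustrative choice not taken from the notes:

```python
import math

# X ~ Exponential(rate 1): E(X) = 1, Var(X) = 1, and P(X >= c) = exp(-c).
mean, var = 1.0, 1.0

for c in [2.0, 3.0, 5.0]:
    exact = math.exp(-c)                 # exact tail probability
    markov = mean / c                    # Markov: P(X >= c) <= E(X)/c
    # Chebyshev with k chosen so that mu + k*sigma = c, i.e. k = (c - mean)/sigma;
    # then P(X >= c) <= P(|X - mu| >= k sigma) <= 1/k^2
    k = (c - mean) / math.sqrt(var)
    chebyshev = 1.0 / k**2
    assert exact <= markov and exact <= chebyshev
    print(f"c={c}: exact={exact:.4f}  Markov bound={markov:.4f}  Chebyshev bound={chebyshev:.4f}")
```

Both inequalities hold for every c, though neither bound is tight here; Chebyshev eventually beats Markov as c grows because it uses the variance as well as the mean.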
Problem 1
Two balls are chosen at random from an urn containing 7 white, 4 black, and 1 orange ball. Suppose that we win $1 for each white ball drawn and lose $1 for each orange ball drawn. Let X denote the amount (in dollars) that we win.
1 What are the possible values of X?
2 Determine the probability mass function and cumulative distribution function of X.

Solution to Problem 1 I
Sample space: Ω = {WW, WB, WO, BB, BO}
Note that the outcomes are not equally likely with associated probabilities:
P({WW}) = C(7,2)/C(12,2) = 21/66 = 7/22,
P({WB}) = (7 × 4)/66 = 14/33,
P({WO}) = (7 × 1)/66 = 7/66,
P({BB}) = C(4,2)/66 = 6/66 = 1/11,
P({BO}) = (4 × 1)/66 = 4/66 = 2/33.

Solution to Problem 1 II
The possible values of X are −1, 0, 1, 2.
The probability mass function is given by
P(X = −1) = P({BO}) = 2/33
P(X = 0) = P({WO, BB}) = 7/66 + 1/11 = 13/66
P(X = 1) = P({WB}) = 14/33
P(X = 2) = P({WW}) = 7/22

The cumulative distribution function is
F(x) = P(X ≤ x) =
  0, if x < −1
  P(X = −1) = 2/33, if −1 ≤ x < 0
  P(X = −1) + P(X = 0) = 17/66, if 0 ≤ x < 1
  P(X = −1) + P(X = 0) + P(X = 1) = 15/22, if 1 ≤ x < 2
  1, if x ≥ 2
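The pmf just derived can be double-checked by exhaustive enumeration of the C(12,2) = 66 equally likely draws. A short Python sketch:

```python
from itertools import combinations
from fractions import Fraction

# 7 white (W), 4 black (B), 1 orange (O); winnings: +1 per white, -1 per orange.
urn = ['W'] * 7 + ['B'] * 4 + ['O']
pairs = list(combinations(range(12), 2))   # all C(12,2) = 66 equally likely draws
counts = {}
for i, j in pairs:
    x = sum(+1 if urn[k] == 'W' else (-1 if urn[k] == 'O' else 0) for k in (i, j))
    counts[x] = counts.get(x, 0) + 1

pmf = {x: Fraction(c, len(pairs)) for x, c in counts.items()}
for x in sorted(pmf):
    print(x, pmf[x])   # matches 2/33, 13/66, 14/33, 7/22
```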
Problem 2
In each box of a given product there is a coupon with a number ranging
from 1 to 6. If a housewife succeeds in collecting all the coupons numbered 1 to 6, then she wins a prize. Assume that each number is equally likely to appear. What is the expected number of boxes she should buy in order to win the prize?

Solution to Problem 2 I
Let X1 be the number of boxes that must be bought to get the first coupon, X2 the number of additional boxes needed to get a coupon different from the first, and X3 the number of additional boxes needed to get a coupon different from the first two; similarly for X4, X5, X6. Let X be the total number of boxes that must be bought in order to collect all 6 different coupons. Then X = X1 + X2 + · · · + X6. We need to find E(X).

Solution to Problem 2 II
We know E(X) = E(X1) + E(X2) + · · · + E(X6). Note that X1 = 1,
and so E(X1) = 1. For X2, since we have already obtained one coupon, the probability of obtaining a new coupon in each subsequent trial is 5/6. Hence P(X2 = x) = (1 − 5/6)^(x−1) (5/6), i.e. X2 is a geometric r.v. with p = 5/6, and so E(X2) = 6/5.
Similarly, X3 is a geometric r.v. with p = 4/6, and X4, X5, X6 are geometric random variables with p = 3/6, 2/6 and 1/6 respectively.
Finally, we can compute E(X):
E(X) = E(X1) + E(X2) + · · · + E(X6) = 1 + 6/5 + 6/4 + 6/3 + 6/2 + 6/1 = 14.7.
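The value 14.7 can be confirmed both exactly and by simulation. A short Python sketch (the seed and trial count are arbitrary choices):

```python
import random
from fractions import Fraction

# Exact expectation: E(X) = 6/6 + 6/5 + 6/4 + 6/3 + 6/2 + 6/1
exact = sum(Fraction(6, k) for k in range(1, 7))
print(float(exact))  # 14.7

# Monte Carlo check: buy boxes until all 6 coupon numbers have been seen.
random.seed(0)

def boxes_needed():
    seen = set()
    n = 0
    while len(seen) < 6:
        seen.add(random.randrange(6))   # each coupon number equally likely
        n += 1
    return n

trials = 100_000
estimate = sum(boxes_needed() for _ in range(trials)) / trials
print(estimate)  # close to 14.7
```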
Problem 3
Suppose that airplane engines will fail, when in flight, with probability
1 − p, independently from engine to engine. If an airplane needs a majority of its engines operative to make a successful flight, for what values of p is a 5-engine plane preferable to a 3-engine one?

Solution to Problem 3 I
The number of operative engines in a 5-engine plane follows a binomial distribution b(5, p). Hence
P(5-engine plane fails) = P(at least 3 engines fail) = P(at most 2 engines operative)
= (1 − p)^5 + 5(1 − p)^4 p + 10(1 − p)^3 p^2 = (1 − p)^3 (1 + 3p + 6p^2).

The number of operative engines in a 3-engine plane follows a binomial distribution b(3, p). Hence
P(3-engine plane fails) = P(at least 2 engines fail) = P(at most 1 engine operative)
= (1 − p)^3 + 3(1 − p)^2 p = (1 − p)^2 (1 + 2p).

Solution to Problem 3 II
Hence
P(5-engine plane fails) − P(3-engine plane fails)
= (1 − p)^2 [(1 − p)(1 + 3p + 6p^2) − (1 + 2p)]
= 3p^2 (1 − p)^2 (1 − 2p).

This difference is negative whenever p is greater than 1/2. Hence a 5-engine plane is preferable to a 3-engine one if p > 1/2, i.e. if each engine functions successfully with probability greater than one half.

Problem 4
1 Verify that the Poisson distribution can approximate the binomial distribution
when the number of Bernoulli trials is very large and P(success) is very small, while the mean remains finite. To be precise, suppose X has a binomial distribution with parameters n and p. If p → 0 and np → λ as n → +∞, then
P(X = k) → e^(−λ) λ^k / k!.
2 Let n = 100, p = 0.05, λ = 5, X ∼ B(n, p) and Y ∼ Pois(λ). Compare the values of P(X ≤ 5) and P(Y ≤ 5).

Solution to Problem 4 I
1 The following manipulation is purely technical. As n → ∞, let λ = np. Then
P(X = k) = n!/(k!(n − k)!) p^k (1 − p)^(n−k)
= n!/(k!(n − k)!) (p/(1 − p))^k (1 − p)^n
= n!/(k!(n − k)!) (λ/(n − λ))^k (1 − λ/n)^n.
Since (1 − λ/n)^n → e^(−λ), this is approximately
e^(−λ) λ^k/k! · n!/((n − k)!(n − λ)^k).
Since, as n → ∞,
n!/((n − k)!(n − λ)^k)
= [n/(n − λ)] [(n − 1)/(n − λ)] · · · [(n − k + 1)/(n − λ)]
= [1 + λ/(n − λ)] [1 + (λ − 1)/(n − λ)] · · · [1 + (λ − k + 1)/(n − λ)]
−→ 1,
we have
n!/(k!(n − k)!) p^k (1 − p)^(n−k) −→ e^(−λ) λ^k/k!.
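The limit can also be watched numerically. A brief Python sketch holding λ = np = 5 fixed while n grows (the values of n and k are illustrative choices):

```python
from math import comb, exp, factorial

lam, k = 5.0, 3
poisson = exp(-lam) * lam**k / factorial(k)   # e^-5 * 5^3/3! ~ 0.140374

for n in [10, 100, 1000, 10000]:
    p = lam / n                                # keep np = lambda fixed
    binom = comb(n, k) * p**k * (1 - p)**(n - k)
    print(n, binom)   # approaches the Poisson value as n grows
```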
Solution to Problem 4 II
2 This part gives the practical meaning of the term "Poisson approximation to the binomial". It also shows how tedious a statistical computation can be, and thereby motivates the use of software packages such as the free R language (cran.r-project.org).
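Indeed, the two quantities below can be reproduced in a few lines. A Python sketch (Python rather than R here, purely as an illustration):

```python
from math import comb, exp, factorial

n, p, lam = 100, 0.05, 5

# P(X <= 5) for X ~ B(100, 0.05)
binom_cdf = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(6))

# P(Y <= 5) for Y ~ Pois(5)
pois_cdf = sum(exp(-lam) * lam**k / factorial(k) for k in range(6))

print(binom_cdf)  # ~ 0.6159991
print(pois_cdf)   # ~ 0.6159607
```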
pX(0) = (0.95)^100 = 0.005920529
pX(1) = C(100,1)(0.05)(0.95)^99 = 0.031160680
pX(2) = C(100,2)(0.05)^2(0.95)^98 = 0.081181772
pX(3) = C(100,3)(0.05)^3(0.95)^97 = 0.139575678
pX(4) = C(100,4)(0.05)^4(0.95)^96 = 0.178142642
pX(5) = C(100,5)(0.05)^5(0.95)^95 = 0.180017800
P(X ≤ 5) = 0.6159991

pY(0) = e^(−5) = 0.006737947
pY(1) = 5e^(−5) = 0.033689735
pY(2) = (5^2/2!)e^(−5) = 0.084224337
pY(3) = (5^3/3!)e^(−5) = 0.140373896
pY(4) = (5^4/4!)e^(−5) = 0.175467370
pY(5) = (5^5/5!)e^(−5) = 0.175467370
P(Y ≤ 5) = 0.6159607

Problem 5
The number of times that an individual contracts a cold in a given year is
a Poisson random variable with mean θ = 6. Suppose a new wonder drug (based on large quantities of vitamin C) has just been marketed that reduces the Poisson mean to θ = 4 for 60 percent of the population. For the other 40 percent of the population the drug has no appreciable effect on colds. If an individual tries the drug for a year and has 3 colds in that time, how likely is it that the drug is beneficial for him/her? (Hint: use Bayes' rule.)

Solution to Problem 5
Let X be the number of colds the individual contracts within a year. Then X ∼ Pois(4) if the drug is effective and X ∼ Pois(6) if it is not. Denote by E the event that the drug is effective. Then
P(E) = 0.6, P(E^c) = 0.4,
P(X = 3 | E) = e^(−4) 4^3 / 3! = 32e^(−4)/3,
P(X = 3 | E^c) = e^(−6) 6^3 / 3! = 36e^(−6).
Hence, using the law of total probability and Bayes' theorem,
P(X = 3) = 0.6 × 32e^(−4)/3 + 0.4 × 36e^(−6) = 6.4e^(−4) + 14.4e^(−6),
P(E | X = 3) = P(X = 3 | E)P(E) / P(X = 3) = 6.4e^(−4) / (6.4e^(−4) + 14.4e^(−6)) ≈ 0.7666.
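The posterior probability can be verified numerically. A compact Python sketch:

```python
from math import exp, factorial

def pois_pmf(k, theta):
    """P(X = k) for X ~ Pois(theta)."""
    return exp(-theta) * theta**k / factorial(k)

prior_eff = 0.6                  # P(E): drug effective
like_eff = pois_pmf(3, 4)        # P(X = 3 | E)
like_not = pois_pmf(3, 6)        # P(X = 3 | E^c)

# Bayes' theorem with the law of total probability in the denominator
posterior = prior_eff * like_eff / (prior_eff * like_eff + 0.4 * like_not)
print(round(posterior, 4))  # 0.7666
```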
Problem 6
Let f(x) denote the density function of the random variable X. Suppose
X has a symmetric distribution about a, that is, f(a + x) = f(a − x) for every x. Show that E(X) = a, provided the expectation exists.

Solution to Problem 6 I
E(X) = ∫_{−∞}^{+∞} x f(x) dx
= ∫_{−∞}^{a} x f(x) dx + ∫_{a}^{+∞} x f(x) dx
= ∫_{0}^{+∞} (a − u) f(a − u) du + ∫_{0}^{+∞} (u + a) f(u + a) du   (substituting u = a − x and u = x − a)
= a ∫_{0}^{+∞} f(a − u) du + a ∫_{0}^{+∞} f(u + a) du   (the u-terms cancel, since f(a − u) = f(u + a))
= a ∫_{−∞}^{a} f(x) dx + a ∫_{a}^{+∞} f(x) dx
= a ∫_{−∞}^{+∞} f(x) dx
= a.

Problem 7
Denote by MX(t) the moment generating function of a random variable X.
1 Show that for any real number a,
P(X ≥ a) ≤ e^(−ta) MX(t) for all t > 0; and
P(X ≤ a) ≤ e^(−ta) MX(t) for all t < 0.
2 Let X be a Poisson random variable with mean θ. Show that
P(X ≥ x) ≤ e^(−θ) (eθ/x)^x
for any x > θ.

Solution to Problem 7 I
1 First assume t > 0. Then
I{X ≥ a} = I{e^(tX) ≥ e^(ta)} = I{1 ≤ e^(tX)/e^(ta)} ≤ e^(tX)/e^(ta).
Taking expectations,
E[I{X ≥ a}] ≤ e^(−ta) E(e^(tX)),
i.e. P(X ≥ a) ≤ e^(−ta) MX(t) for all t > 0.
This proves the first inequality. To prove the second, replace X by −X and a by −a in the first inequality:
P(−X ≥ −a) ≤ e^(ta) M_{−X}(t) for all t > 0,
i.e. P(X ≤ a) ≤ e^(ta) MX(−t) for all t > 0.
Hence, replacing −t by t, we have
P(X ≤ a) ≤ e^(−ta) MX(t) for all t < 0.

Solution to Problem 7 II
2 By the first inequality in part 1,
P(X ≥ x) ≤ e^(−tx) MX(t) = e^(−tx) e^(θ(e^t − 1)) = e^(−θ) e^(θe^t − tx).
Since θe^t − tx is minimized at t = ln(x/θ) > 0 (as x > θ > 0), and the inequality holds for all t > 0, we have
P(X ≥ x) ≤ e^(−θ) e^(x − x ln(x/θ)) = e^(−θ) (eθ/x)^x for all x > θ.