Example Class 4
STAT1301 Probability and Statistics I
Chan Chi Ho, Chan Tsz Hin & Shi Yun
October 4, 2011

Review: abbreviated event notation
Usually we abbreviate the description of the event (a set) {ω : X(ω) ≤ x} simply as {X ≤ x}, so that we can also abbreviate the probability notation P({ω : X(ω) ≤ x}) as P({X ≤ x}), or even more simply as P(X ≤ x). This is justified because the two probability measures P_X and P are consistent, by the defining property of the random variable X.

Review: cumulative distribution function (cdf)

Definition of CDF
F_X(x) = P(X ≤ x) = P({ω : X(ω) ≤ x}), where x ∈ (−∞, +∞)
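As a quick illustration of this definition (a sketch of our own, not part of the original notes), the CDF of a fair six-sided die can be computed directly from P(X ≤ x); the die example is chosen here only for concreteness:

```python
from fractions import Fraction

# CDF of a fair six-sided die, computed straight from the definition
# F_X(x) = P(X <= x) = sum of p_X(k) over all support points k <= x.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}  # state space {1, ..., 6}

def cdf(x):
    """F_X(x) = P(X <= x): a right-continuous step function."""
    return sum((p for k, p in pmf.items() if k <= x), Fraction(0))

print(cdf(0.5))   # 0   -- below the support
print(cdf(3))     # 1/2 -- jumps at each support point
print(cdf(3.7))   # 1/2 -- flat between support points
print(cdf(6))     # 1
```

Note the step shape: F_X increases only by jumps of size p_X(k) at the support points.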
Properties of CDF
1. 0 ≤ F_X(x) ≤ 1
2. If x1 < x2, then F_X(x1) ≤ F_X(x2).
3. lim_{x→+∞} F_X(x) = F_X(+∞) = 1
4. lim_{x→−∞} F_X(x) = F_X(−∞) = 0
5. lim_{x→a+} F_X(x) = F_X(a+) = F_X(a), where a+ = lim_{0<ε→0} (a + ε)
6. P(a < X ≤ b) = F_X(b) − F_X(a)
7. P(X > a) = 1 − F_X(a)
8. P(X < b) = F_X(b−), where b− = lim_{0<ε→0} (b − ε)
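The interval identities in properties 6 and 7 are easy to sanity-check numerically. The following sketch (our own illustration, again using a fair die) compares both sides of each identity:

```python
from fractions import Fraction

# Fair-die pmf and CDF; check CDF properties 6 and 7 on a sample interval.
pmf = {k: Fraction(1, 6) for k in range(1, 7)}

def cdf(x):
    return sum((p for k, p in pmf.items() if k <= x), Fraction(0))

def prob(pred):
    """P(X in A), computed directly from the pmf."""
    return sum((p for k, p in pmf.items() if pred(k)), Fraction(0))

a, b = 2, 5
# Property 6: P(a < X <= b) = F_X(b) - F_X(a)
print(prob(lambda k: a < k <= b) == cdf(b) - cdf(a))   # True
# Property 7: P(X > a) = 1 - F_X(a)
print(prob(lambda k: k > a) == 1 - cdf(a))             # True
```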
Review: probability mass function (pmf)
The probability mass function (PMF) is the function that gives the probability that a discrete random variable is exactly equal to some value.

Definition of PMF
p_X(x) = P(X = x) = P({ω : X(ω) = x}), where x ∈ (−∞, +∞)
Properties of PMF
The state space of a discrete random variable X must be countable. Denote the state space of a discrete r.v. X by S = {x1, x2, x3, ...}.
1. 0 ≤ p_X(xk) ≤ 1, k = 1, 2, ...
2. If x ≠ xk for all k = 1, 2, ..., then p_X(x) = 0.
3. Σ_k p_X(xk) = 1
4. F_X(x) = Σ_{xk ≤ x} p_X(xk)
Note that P(E) = 0 does not imply E = ∅.
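These properties can be checked numerically for any concrete discrete distribution. As an illustration of properties 3 and 4 (our own example, not from the notes), take X = the sum of two fair dice:

```python
from fractions import Fraction
from collections import Counter

# PMF of X = sum of two fair dice, built by enumerating the 36 outcomes.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
pmf = {x: Fraction(c, 36) for x, c in counts.items()}

# Property 3: the probabilities sum to 1.
assert sum(pmf.values()) == 1

# Property 4: the CDF is the running sum of the pmf over x_k <= x.
def cdf(x):
    return sum((p for k, p in pmf.items() if k <= x), Fraction(0))

print(pmf[7])   # 1/6  -- the most likely sum
print(cdf(7))   # 7/12 = P(X <= 7)
```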
If X is a continuous random variable, then P(X = x) = 0 for all real x. For continuous random variables, we use the probability density function (pdf) instead of the pmf. Continuous random variables will be discussed in the next tutorial.

Review: Mean and Variance

Expectation
            X is discrete                  X is continuous
E(X)        Σ_{x ∈ X(Ω)} x p(x)           ∫_{X(Ω)} x f(x) dx
E(g(X))     Σ_{x ∈ X(Ω)} g(x) p(x)        ∫_{X(Ω)} g(x) f(x) dx
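For the discrete column, each of the two sums is one line of code. A small sketch (our own example, using a fair die and g(x) = x²):

```python
from fractions import Fraction

# E(X) and E(g(X)) for a discrete r.v., straight from the table above.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}   # fair die

def expect(pmf, g=lambda x: x):
    """E(g(X)) = sum over the state space of g(x) * p(x)."""
    return sum(g(x) * p for x, p in pmf.items())

print(expect(pmf))                    # 7/2  = E(X)
print(expect(pmf, lambda x: x * x))   # 91/6 = E(X^2)
```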
Mean and Variance
Mean of X = E(X)
Variance of X = Var(X) = E((X − E(X))²) = E(X²) − E(X)². The mean is usually denoted by µ and the variance is usually denoted by σ².

Review: moment generating function (mgf)

Moment and moment generating function
Moment. Let r be a positive integer. The r-th moment of X is E(X^r). The r-th moment of X about b is E((X − b)^r).
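As a concrete illustration (our own example, not from the notes), the first few moments of a fair die, together with the second moment about the mean, which is just Var(X):

```python
from fractions import Fraction

# r-th moments E((X - b)^r) of a fair die, computed from the pmf.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

def moment(r, b=0):
    """E((X - b)^r); b = 0 gives the ordinary r-th moment."""
    return sum(p * (x - b) ** r for x, p in pmf.items())

mu = moment(1)
print(mu)                # 7/2   -- first moment, i.e. the mean
print(moment(2))         # 91/6  -- second moment
print(moment(2, b=mu))   # 35/12 -- second moment about the mean = Var(X)
```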
Moment generating function. Let X be a random variable. The moment generating function of X is defined as M_X(t) = E(e^{tX}) if it exists. The domain of M_X(t) is all real numbers t such that the expectation E(e^{tX}) is finite. Also,
M_X^{(r)}(0) = (d^r/dt^r) M_X(t) |_{t=0} = E(X^r).
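This identity between the derivatives of the mgf at 0 and the moments can be checked numerically. A sketch (our own example) for a Bernoulli(p) variable, whose mgf is M_X(t) = 1 − p + p·e^t, using a central finite difference for the first derivative:

```python
import math

# Bernoulli(p): M_X(t) = E(e^{tX}) = (1 - p) + p * e^t, and E(X) = p.
p = 0.3

def mgf(t):
    return (1 - p) + p * math.exp(t)

# First derivative at 0 via a central difference; it should approximate
# M_X'(0) = E(X) = p.
h = 1e-6
deriv = (mgf(h) - mgf(-h)) / (2 * h)
print(abs(deriv - p) < 1e-8)   # True
```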
The moment generating function uniquely characterizes the distribution.

Problem 1

The random experiment is “tossing a coin (not necessarily a fair one)
once; if the coin turns up heads, then you earn 1 dollar, otherwise 0 dollars.” You are interested in how many dollars you will win after tossing it once. Define an appropriate random variable to describe the experiment, whose state space should be {0, 1}, and derive the PMF on its state space. Assume P({H}) = p. What is the expected amount of money you will win?

Solution to Problem 1
Solution
Let Ω = {H, T} and S = {0, 1}. Define the random variable X : Ω → S by
X(H) = 1, X(T) = 0,
with P({H}) = p. The PMF p_X(·) is
p_X(1) = P(X = 1) = P({H}) = p
p_X(0) = 1 − p
Hence
E(X) = p · 1 + (1 − p) · 0 = p

Problem 2

The experiment changes to “Tossing a coin (not necessarily a fair
one) n times; if the coin turns up heads, then you earn 1 dollar, otherwise 0 dollars.” You are interested in how many dollars you will win after the experiment, i.e., after tossing the coin n times. Define a new random variable to describe the experiment, whose state space should be {0, 1, ..., n}, and derive the PMF on its state space. Assume P({H}) = p. What is the expected amount of money you will win? What is the probability that you win at least some money?

Solution to Problem 2

Solution
Let Ω = {H, T} × {H, T} × ... × {H, T} = {HH···H, HH···HT, ..., TT···T} and S = {0, 1, ..., n}. Define the random variable Y : Ω → S as
Y(ω) = number of H’s in ω.
Assume P({H}) = p. Writing C(n, k) for the binomial coefficient, the PMF p_Y(·) on S is
p_Y(k) = P(Y = k) = C(n, k) p^k (1 − p)^{n−k} = C(n, k) p^k q^{n−k},
where q = 1 − p.
Hence
E(Y) = Σ_{k=0}^{n} k C(n, k) p^k q^{n−k}
= Σ_{k=0}^{n} [n!/(k!(n − k)!)] · k p^k q^{n−k}
= Σ_{k=1}^{n} [n!/((k − 1)!(n − k)!)] p^k q^{n−k}
= Σ_{k=0}^{n−1} [n!/(k!(n − 1 − k)!)] p^{k+1} q^{n−1−k}
= np Σ_{k=0}^{n−1} [(n − 1)!/(k!(n − 1 − k)!)] p^k q^{n−1−k}
= np,
since the last sum adds up all the Binomial(n − 1, p) probabilities and therefore equals 1.

The probability that you win at least some money is
P(Y ≥ 1) = 1 − P(Y = 0) = 1 − q^n

Problem 3

The experiment changes to “tossing a coin (not necessarily a fair one)
indefinitely until it turns up a head. The number of dollars you will win is equal to the number of tosses it takes to see the first head.” You are interested in how many dollars you will win after the experiment. Define a new random variable to describe the experiment, whose state space should be {1, 2, ...}, and derive the pmf on its state space. Assume P({H}) = p. What is the expected amount of money you will win? What is the probability that you will win at least 3 dollars?

Solution to Problem 3

Solution
Let Ω = {H, TH, TTH, TTTH, ...}; then Ω is a countably infinite set. Let S = {1, 2, 3, ...}. Define a random variable X : Ω → S as
X(ω) = the length of ω.
Assume P({H}) = p. Then the pmf p_X(·) on S is
p_X(k) = P(X = k) = q^{k−1} p,
where we write q = 1 − p. Hence
E(X) = Σ_{k=1}^{∞} k q^{k−1} p
= p Σ_{k=1}^{∞} (d/dq) q^k
= p (d/dq) Σ_{k=1}^{∞} q^k
= p (d/dq) [q/(1 − q)]
= p / (1 − q)²
= 1/p

The probability that you win at least 3 dollars is
P(X ≥ 3) = 1 − P(X = 1) − P(X = 2)
= 1 − p − qp
= q²

Problem 4

The random experiment changes again to “tossing a coin (not
necessarily a fair one) indefinitely until r heads have been observed. The number of dollars you will win is the total number of tosses it takes to see the r-th head.” You are interested in the number of dollars you will win after the experiment. Define a new random variable to describe the experiment, whose state space should be {r, r + 1, r + 2, ...}, and derive the pmf on its state space. Assume P({H}) = p. What is the expected amount of money you will win?

Solution to Problem 4

Solution
Let Ω = {xx...xH : there are r − 1 H’s among xx...x} and S = {r, r + 1, r + 2, ...}. Define a random variable X : Ω → S as
X(ω) = length of ω.
Assume P({H}) = p; then the pmf p_X(·) on S is
p_X(k) = C(k − 1, r − 1) p^r q^{k−r},
where we write q = 1 − p. Hence
E(X) = Σ_{k=r}^{∞} k C(k − 1, r − 1) p^r q^{k−r}
= Σ_{k=r}^{∞} [k!/((k − r)!(r − 1)!)] p^r q^{k−r}
= r p^r Σ_{k=r}^{∞} C(k, r) q^{k−r}
= r p^r Σ_{k=0}^{∞} C(k + r, r) q^k
= r p^r · 1/(1 − q)^{r+1}   (negative binomial theorem)
= r/p

Problem 5

Let X follow a discrete uniform distribution on [a, b], where a and b
are integers with a ≤ b. The pmf of X is
P(X = x) = p(x) = 1/(b − a + 1) if a ≤ x ≤ b, and p(x) = 0 otherwise.
1. Find the moment generating function of X.
2. Using part 1, or otherwise, compute E(X) and Var(X).

Solution to Problem 5
Solution
1. To find the mgf of X:
M_X(t) = E(e^{tX}) = Σ_{x=a}^{b} e^{tx}/(b − a + 1)
= [e^{at}/(b − a + 1)] · (1 + e^t + e^{2t} + ... + e^{(b−a)t})
= e^{at}(1 − e^{(b−a+1)t}) / [(b − a + 1)(1 − e^t)]   if t ≠ 0, and 1 if t = 0
= (e^{at} − e^{(b+1)t}) / [(b − a + 1)(1 − e^t)]   if t ≠ 0, and 1 if t = 0.
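The closed form just derived can be checked against the defining sum at a few values of t. A small sketch (our own check; the endpoints a and b are arbitrary):

```python
import math

# Discrete uniform on [a, b]: compare the direct sum E(e^{tX}) with the
# closed form (e^{at} - e^{(b+1)t}) / ((b - a + 1)(1 - e^t)) for t != 0.
a, b = 2, 7
n = b - a + 1

def mgf_sum(t):
    return sum(math.exp(t * x) for x in range(a, b + 1)) / n

def mgf_closed(t):
    if t == 0:
        return 1.0
    return (math.exp(a * t) - math.exp((b + 1) * t)) / (n * (1 - math.exp(t)))

for t in (-0.5, 0.1, 1.0):
    assert abs(mgf_sum(t) - mgf_closed(t)) < 1e-9
print("closed form matches the direct sum")
```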
2. We can find E(X) and E(X²) by the formulae E(X) = M_X′(0) and E(X²) = M_X″(0), since M_X(t) is twice differentiable at t = 0.

lim_{t→0+} [M_X(t) − M_X(0)]/t
= lim_{t→0+} { (e^{at} − e^{(b+1)t}) / [(b − a + 1)(1 − e^t)] − 1 } / t
= lim_{t→0+} [e^{at} − e^{(b+1)t} − (b − a + 1)(1 − e^t)] / [t(b − a + 1)(1 − e^t)]   (0/0)
= lim_{t→0+} [a e^{at} − (b + 1) e^{(b+1)t} + (b − a + 1) e^t] / [(b − a + 1)(1 − e^t) − (b − a + 1) t e^t]   (L’Hôpital’s rule; still 0/0)
= lim_{t→0+} [a² e^{at} − (b + 1)² e^{(b+1)t} + (b − a + 1) e^t] / [−(b − a + 1)(2 e^t + t e^t)]   (L’Hôpital’s rule)
= [a² − (b + 1)² + (b − a + 1)] / [−2(b − a + 1)]
= (a + b)/2

Similarly, we can check that lim_{t→0−} [M_X(t) − M_X(0)]/t = (a + b)/2. Hence M_X′(0) exists and equals (a + b)/2, and so E(X) = (a + b)/2.

Next we find the variance. For convenience, define another r.v. Y by Y = X − a + 1, and write n = b − a + 1. Then Y is discrete uniform on [1, n], with E(Y) = E(X) − a + 1 = (b − a + 2)/2 = (n + 1)/2. Also, Var(X) = Var(Y), so we need only find Var(Y). Note that
M_Y(t) = (e^t − e^{(n+1)t}) / [n(1 − e^t)]   if t ≠ 0, and 1 if t = 0.
We need to find M_Y″(0).
lim_{t→0+} [M_Y′(t) − M_Y′(0)]/t
= lim_{t→0+} { [e^t − (n + 1) e^{(n+1)t} + n e^{(n+2)t}] / [n(1 − e^t)²] − (n + 1)/2 } / t
= lim_{t→0+} { 2[e^t − (n + 1) e^{(n+1)t} + n e^{(n+2)t}] − n(n + 1)(1 − e^t)² } / [2nt(1 − e^t)²]   (0/0)
= ... (applying L’Hôpital’s rule three times)
= (2n² + 3n + 1)/6
Similarly, we have lim_{t→0−} [M_Y′(t) − M_Y′(0)]/t = (2n² + 3n + 1)/6. Hence M_Y″(0) exists and equals (2n² + 3n + 1)/6, so E(Y²) = (2n² + 3n + 1)/6. Therefore
Var(X) = Var(Y) = E(Y²) − E(Y)²
= (2n² + 3n + 1)/6 − [(n + 1)/2]²
= (n² − 1)/12
= [(b − a + 1)² − 1]/12
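Both conclusions can be verified directly from the pmf. A quick sketch (our own check, with arbitrarily chosen integer endpoints a and b):

```python
from fractions import Fraction

# Discrete uniform on [a, b]: check E(X) = (a + b)/2 and
# Var(X) = ((b - a + 1)^2 - 1)/12 against direct sums over the pmf.
a, b = -3, 8
n = b - a + 1
pmf = {x: Fraction(1, n) for x in range(a, b + 1)}

mean = sum(x * p for x, p in pmf.items())
var = sum((x - mean) ** 2 * p for x, p in pmf.items())

assert mean == Fraction(a + b, 2)
assert var == Fraction(n * n - 1, 12)
print(mean, var)   # 5/2 143/12
```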
Alternative solution
We may calculate E(X) and Var(X) directly, without making use of the mgf of X.
E(X) = Σ_{x=a}^{b} x/(b − a + 1)
= [a + (a + 1) + ... + b]/(b − a + 1)
= [1/(b − a + 1)] · (a + b)(b − a + 1)/2
= (a + b)/2
To find Var(X), again we make use of the r.v. Y = X − a + 1. Y is discrete uniform on [1, n], and Var(X) = Var(Y), so we need only find Var(Y).
E(Y²) = Σ_{y=1}^{n} y²/n
= (1/n) · (1² + 2² + ... + n²)
= (1/n) · n(n + 1)(2n + 1)/6
= (n + 1)(2n + 1)/6
Hence
Var(X) = Var(Y) = E(Y²) − E(Y)²
= (n + 1)(2n + 1)/6 − [(n + 1)/2]²
= (n² − 1)/12
= [(b − a + 1)² − 1]/12

Problem 6

Let N cells be numbered 1, 2, ..., N. We randomly throw balls into
them. The process ends when any one of the cells receives two balls.
What is the expected number of throws?

Solution to Problem 6
Solution
Define X = the number of throws required. Hence 2 ≤ X ≤ N + 1.
Consider some simple cases first:
P(X = 2) = 1/N
P(X = 3) = [(N − 1)/N] · (2/N)
P(X = 4) = [(N − 1)/N] · [(N − 2)/N] · (3/N)
Thus, in general, for 2 ≤ k ≤ N + 1 (the first k − 1 throws land in distinct cells, and the k-th throw hits one of those k − 1 occupied cells),
P(X = k) = [(N − 1)/N] · [(N − 2)/N] · ... · [(N − k + 2)/N] · [(k − 1)/N]
= (N − 1)(N − 2) · ... · (N − k + 2)(k − 1) / N^{k−1}
Hence
E(X) = Σ_{k=2}^{N+1} k P(X = k) = Σ_{k=2}^{N+1} k (N − 1)(N − 2) · ... · (N − k + 2)(k − 1) / N^{k−1}

Problem 7

Let X be a binomial random variable with parameters n and p. Show
that
E(1/(X + 1)) = [1 − (1 − p)^{n+1}] / [(n + 1)p].

Solution to Problem 7
Solution
E(1/(X + 1)) = Σ_{x=0}^{n} [1/(x + 1)] C(n, x) p^x (1 − p)^{n−x}
= [1/(n + 1)] Σ_{x=0}^{n} C(n + 1, x + 1) p^x (1 − p)^{n−x}
= [1/(n + 1)] Σ_{x=1}^{n+1} C(n + 1, x) p^{x−1} (1 − p)^{n+1−x}
= [1/((n + 1)p)] · ( Σ_{x=0}^{n+1} C(n + 1, x) p^x (1 − p)^{n+1−x} − (1 − p)^{n+1} )
= [1/((n + 1)p)] · (1 − (1 − p)^{n+1})   (binomial theorem)
= [1 − (1 − p)^{n+1}] / [(n + 1)p]

Problem 8
At a dance party there are N married couples. Among them, W men and W ladies are randomly selected and paired off to dance a waltz. After the waltz, R ≥ 2 persons (possibly all men or all women) are randomly selected, again from among the N married couples, to perform a round dance together. It is known that Mr. and Mrs. Chan are one of the N couples.
1. Show that the probability that Mr. and Mrs. Chan are paired off to dance the waltz is W/N².
2. Show that the probability that Mr. and Mrs. Chan are both selected to dance the round dance is R(R − 1)/(2N(2N − 1)).
3. Calculate the expected number of married couples among the round dancers.

Solution to Problem 8

Solution
1. The probability that Mr. and Mrs. Chan are paired off to dance the waltz is
[C(N − 1, W − 1)]² (W − 1)! / ([C(N, W)]² W!) = W/N².
(Choose the W men and W ladies so that both Chans are included, then pair Mr. Chan with Mrs. Chan and the remaining W − 1 men with the remaining W − 1 ladies in (W − 1)! of the W! equally likely pairings.)
2. The probability that Mr. and Mrs. Chan are both selected to dance the round dance is
C(2N − 2, R − 2) / C(2N, R) = R(R − 1) / (2N(2N − 1)).
3. Define
X_i = 1 if the i-th married couple is selected for the round dance, and X_i = 0 otherwise.
The total number of married couples selected for the round dance is X = X_1 + X_2 + ... + X_N. By part 2, E(X_i) = P(X_i = 1) = R(R − 1)/(2N(2N − 1)), so
E(X) = E(X_1) + E(X_2) + ... + E(X_N)
= N · R(R − 1) / (2N(2N − 1))
= R(R − 1) / (2(2N − 1))
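A Monte Carlo sketch of part 3 (our own check, not part of the notes): simulate the round-dance selection and compare the average number of complete couples with R(R − 1)/(2(2N − 1)). The values of N and R, and the encoding of couples, are arbitrary choices for illustration.

```python
import random

# Problem 8, part 3: expected number of complete couples among R round
# dancers chosen from 2N people (N couples). Persons 2i and 2i+1 form couple i.
N, R = 6, 5
trials = 200_000
rng = random.Random(0)

total = 0
for _ in range(trials):
    dancers = rng.sample(range(2 * N), R)
    couples = {p // 2 for p in dancers}
    # a couple is complete exactly when both members were selected,
    # so the count of complete couples is R minus the number of distinct couples
    total += R - len(couples)
estimate = total / trials

exact = R * (R - 1) / (2 * (2 * N - 1))
print(abs(estimate - exact) < 0.01)   # True (with high probability)
```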