Chapter 1
Special Distributions
Independent Bernoulli Trials
If $P(X = 1) = p = 1 - P(X = 0)$, then $X$ is said to be a Bernoulli$(p)$ random variable. We refer to the event $[X = 1]$ as success, and to $[X = 0]$ as failure.
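As a quick illustration (a minimal sketch in plain Python; the function name `bernoulli` is our own), one can simulate Bernoulli$(p)$ draws from uniform random numbers and check that the observed success frequency is close to $p$:

```python
import random

def bernoulli(p, rng=random):
    # Return 1 ("success") with probability p, else 0 ("failure"):
    # a Uniform(0,1) draw falls below p with probability exactly p.
    return 1 if rng.random() < p else 0

random.seed(0)
p = 0.3
draws = [bernoulli(p) for _ in range(100_000)]
print(sum(draws) / len(draws))  # success frequency, close to p = 0.3
```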
Let $X_1, \ldots, X_n$ be i.i.d. Bernoulli$(p)$, and let $S_n = X_1 + \cdots + X_n$ denote the number of successes in $n$ independent Bernoulli$(p)$ trials. Now
$$P(X_i = x_i,\ i = 1, \ldots, n) = p^{\sum_1^n x_i} (1 - p)^{n - \sum_1^n x_i}$$
if all $x_i$ equal 0 or 1; this formula gives the joint distribution of $X_1, \ldots, X_n$. From this we obtain
$$P(S_n = k) = \binom{n}{k} p^k (1 - p)^{n-k} \qquad \text{for } k = 0, \ldots, n, \tag{1}$$
since each of the $\binom{n}{k}$ different placings of $k$ 1's in an $n$-vector containing $k$ 1's and $n - k$ 0's has probability $p^k (1 - p)^{n-k}$ from the previous sentence. We say that $S_n \sim \text{Binomial}(n, p)$ when (1) holds. Note that Binomial$(1, p)$ is the same as Bernoulli$(p)$.
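Formula (1) is easy to check numerically; the sketch below (plain Python, using the standard library's `math.comb` for the binomial coefficient) verifies that the $n + 1$ probabilities sum to 1, that the mean is $np$, and that Binomial$(1, p)$ reduces to Bernoulli$(p)$:

```python
from math import comb

def binom_pmf(k, n, p):
    # P(S_n = k) = C(n, k) p^k (1 - p)^(n - k), as in (1).
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 10, 0.4
pmf = [binom_pmf(k, n, p) for k in range(n + 1)]
print(sum(pmf))                              # probabilities sum to 1
print(sum(k * q for k, q in enumerate(pmf))) # mean n p = 4.0
print(binom_pmf(1, 1, p))                    # Binomial(1, p): success prob is p
```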
Let $X_1, X_2, \ldots$ be i.i.d. Bernoulli$(p)$. Let $Y_1 \equiv W_1 \equiv \min\{n : S_n = 1\}$. Since $[Y_1 = k] = [X_1 = 0, \ldots, X_{k-1} = 0, X_k = 1]$, we have
$$P(Y_1 = k) = (1 - p)^{k-1} p \qquad \text{for } k = 1, 2, \ldots. \tag{2}$$
We say that $Y_1 \sim \text{Geometric}(p)$.
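The geometric law (2) can be sanity-checked by simulation; in the sketch below (plain Python, helper names our own) we generate first-success times and compare the empirical frequency of $\{Y_1 = 3\}$ with the pmf value $(1-p)^2 p$:

```python
import random

def geom_pmf(k, p):
    # P(Y_1 = k) = (1 - p)^(k - 1) * p, as in (2).
    return (1 - p)**(k - 1) * p

def first_success(p, rng):
    # Run Bernoulli(p) trials until the first success; return its index.
    n = 1
    while rng.random() >= p:
        n += 1
    return n

rng = random.Random(1)
p = 0.25
draws = [first_success(p, rng) for _ in range(200_000)]
print(draws.count(3) / len(draws))  # empirical frequency of {Y_1 = 3}
print(geom_pmf(3, p))               # exact value (1 - p)^2 p = 0.140625
```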
Now let $W_m \equiv \min\{n : S_n = m\}$. We call $W_m$ the waiting time to the $m$-th success. Let $Y_m \equiv W_m - W_{m-1}$ for $m \geq 1$, with $W_0 \equiv 0$; we call the $Y_m$'s the interarrival times. Note that $[W_m = k] = [S_{k-1} = m - 1, X_k = 1]$. Hence
$$P(W_m = k) = \binom{k-1}{m-1} p^m (1 - p)^{k-m} \qquad \text{for } k = m, m + 1, \ldots. \tag{3}$$
We say that $W_m \sim \text{Negative Binomial}(m, p)$.
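Likewise, (3) can be checked against simulated waiting times; a sketch under the same conventions (plain Python, helper names our own), taking $m = 3$ and $p = 1/2$:

```python
import random
from math import comb

def negbin_pmf(k, m, p):
    # P(W_m = k) = C(k-1, m-1) p^m (1-p)^(k-m), as in (3).
    return comb(k - 1, m - 1) * p**m * (1 - p)**(k - m)

def waiting_time(m, p, rng):
    # Smallest n with S_n = m: run Bernoulli(p) trials until m successes.
    n = successes = 0
    while successes < m:
        n += 1
        successes += rng.random() < p
    return n

rng = random.Random(2)
m, p = 3, 0.5
draws = [waiting_time(m, p, rng) for _ in range(200_000)]
print(draws.count(5) / len(draws))  # empirical frequency of {W_3 = 5}
print(negbin_pmf(5, m, p))          # exact value C(4,2) (1/2)^5 = 0.1875
```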
Exercise 1.1  Show that $Y_1, Y_2, \ldots$ are i.i.d. Geometric$(p)$.
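This is not a proof, but the claim is easy to probe empirically; the sketch below (plain Python, helper name our own) records the interarrival gaps in a long run of trials and checks two consequences of the exercise: the sample mean of the gaps is near the Geometric$(p)$ mean $1/p$, and the fraction of gaps equal to 1 is near $P(Y = 1) = p$:

```python
import random

def interarrival_times(n_trials, p, rng):
    # Run n_trials Bernoulli(p) trials; record the gap between successive successes.
    gaps, last = [], 0
    for i in range(1, n_trials + 1):
        if rng.random() < p:
            gaps.append(i - last)
            last = i
    return gaps

rng = random.Random(3)
p = 0.2
gaps = interarrival_times(500_000, p, rng)
print(sum(gaps) / len(gaps))      # near the Geometric(p) mean 1/p = 5
print(gaps.count(1) / len(gaps))  # near P(Y = 1) = p = 0.2
```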
Since the number of successes in $n_1 + n_2$ trials is the number of successes in the first $n_1$ trials plus the number of successes in the next $n_2$ trials, it is clear that for independent $Z_i \sim \text{Binomial}(n_i, p)$,
$$Z_1 + Z_2 \sim \text{Binomial}(n_1 + n_2, p). \tag{4}$$
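Identity (4) amounts to a convolution identity for the pmfs (numerically, an instance of Vandermonde's identity); the sketch below (plain Python, `math.comb` from the standard library) convolves two binomial pmfs and compares the result with the direct Binomial$(n_1 + n_2, p)$ pmf:

```python
from math import comb

def binom_pmf(k, n, p):
    # P(Z = k) for Z ~ Binomial(n, p), as in (1).
    return comb(n, k) * p**k * (1 - p)**(n - k)

n1, n2, p = 4, 6, 0.3
# Convolve: P(Z1 + Z2 = k) = sum_j P(Z1 = j) P(Z2 = k - j),
# with j restricted so that both pmf arguments are in range.
conv = [sum(binom_pmf(j, n1, p) * binom_pmf(k - j, n2, p)
            for j in range(max(0, k - n2), min(n1, k) + 1))
        for k in range(n1 + n2 + 1)]
direct = [binom_pmf(k, n1 + n2, p) for k in range(n1 + n2 + 1)]
print(max(abs(a - b) for a, b in zip(conv, direct)))  # essentially zero
```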