CS 70
Discrete Mathematics for CS
Spring 2008
David Wagner
Note 15
Introduction to Discrete Probability
Probability theory has its origins in gambling — analyzing card games, dice, roulette wheels. Today it is an
essential tool in engineering and the sciences. No less so in computer science, where its use is widespread
in algorithms, systems, learning theory and artificial intelligence.
Here are some typical statements that you might see concerning probability:
1. The chance of getting a flush in a 5-card poker hand is about 2 in 1000.
2. The chance that a particular implementation of the primality testing algorithm outputs prime when the
input is composite is at most one in a trillion.
3. The average time between system failures is about 3 days.
4. In this load-balancing scheme, the probability that any processor has to deal with more than 12 requests is negligible.
5. There is a 30% chance of a magnitude 8.0 earthquake in Northern California before 2030.
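To see where a figure like the one in statement 1 comes from, we can count directly. The following sketch (not part of the original note) counts the 5-card hands whose cards all share a suit and divides by the total number of hands:

```python
from math import comb

# A flush: all 5 cards drawn from one of the 4 suits. (This simple
# count includes straight flushes, which is fine for a rough figure.)
flush_hands = 4 * comb(13, 5)   # 4 * 1287 = 5148
all_hands = comb(52, 5)         # 2598960

p_flush = flush_hands / all_hands
print(f"P(flush) = {flush_hands}/{all_hands} = {p_flush:.5f}")  # about 0.002
```

The result, roughly 0.002, is the "about 2 in 1000" quoted above.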
Implicit in all such statements is the notion of an underlying probability space. This may be the result of a random experiment that we have ourselves constructed (as in 1, 2 and 3 above), or some model we build of the real world (as in 4 and 5 above). None of these statements makes sense unless we specify the probability space we are talking about: for this reason, statements like 5 (which are typically made without this context) are almost content-free.
Let us try to understand all this more clearly. The first important notion here is that of a random experiment. An example of such an experiment is tossing a coin 4 times, or dealing a poker hand. In the first case an outcome of the experiment might be HTHT or it might be HHHT. The question we are interested in might be "what is the chance that there are exactly 2 H's?" Well, the number of outcomes that meet this condition is (4 choose 2) = 4!/(2!2!) = 6 (corresponding to choosing the positions of the two H's in the sequence of four tosses); these outcomes are HHTT, HTHT, HTTH, THHT, THTH, TTHH. On the other hand, the total number of distinct outcomes for this experiment is 2^4 = 16. If the coin is fair then all these 16 outcomes are equally likely, so the chance that there are exactly 2 H's is 6/16 = 3/8.
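Since the number of outcomes is small, we can check this count by brute force. Here is a short sketch (not from the original note) that enumerates all 16 outcomes and counts those with exactly two H's:

```python
from itertools import product

# Enumerate all 2^4 = 16 equally likely outcomes of 4 coin tosses.
outcomes = [''.join(t) for t in product('HT', repeat=4)]

# Keep the outcomes with exactly two H's.
two_heads = [o for o in outcomes if o.count('H') == 2]
print(sorted(two_heads))                 # HHTT, HTHT, HTTH, THHT, THTH, TTHH
print(len(two_heads) / len(outcomes))    # 6/16 = 0.375, i.e. 3/8
```

This agrees with the calculation above: 6 of the 16 equally likely outcomes qualify.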
As we saw with counting, there is a common framework in which we can view random experiments about flipping coins, dealing cards, rolling dice, etc. A finite process is the following:

We are given a finite population U, of cardinality n. In the case of coin tossing, U = {H, T}, and in card dealing, U is the set of 52 cards.
An experiment consists of drawing a sample of k elements from U. As before we will consider two cases: sampling with replacement and sampling without replacement. Thus in our coin flipping example, n = 2 and the sample size is k = 4, and the sampling is with replacement. The outcome of the experiment is called a sample point. Thus HTHT is an example of a sample point.
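The two sampling modes can be made concrete with small examples. The sketch below (an illustration, not from the original note; the 4-card toy deck is hypothetical) lists the sample points for our coin-flipping experiment, where sampling is with replacement, and for dealing 2 cards in order, where it is without:

```python
from itertools import product, permutations

# With replacement: k = 4 tosses from U = {H, T}, so n = 2.
# Each ordered sequence, e.g. ('H','T','H','T'), is one sample point.
with_repl = list(product('HT', repeat=4))
print(len(with_repl))     # n^k = 2^4 = 16

# Without replacement: deal k = 2 cards in order from a toy
# 4-card population (hypothetical stand-in for the 52-card deck).
deck = ['AS', 'KS', 'AH', 'KH']
without_repl = list(permutations(deck, 2))
print(len(without_repl))  # n(n-1) = 4*3 = 12
```

Note that without replacement no element repeats within a sample point, which is why the count drops from n^k to n(n-1)...(n-k+1).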