Notes for Use During the Midterm Exam; Stat 371; Fall 2011
Not on exam: Finite populations; formula for computing Poisson probabilities by hand; LLN.
Probability.
A chance mechanism (CM), when operated, produces an outcome. The sample space, denoted by S, is the collection of all possible outcomes of the CM. An event is any collection of outcomes. Before the CM is operated, the probability of the event A, denoted by P(A), is a number that measures the likelihood that A will occur.
If we assume the ELC, then each possible outcome is equally likely to occur.
If we assume the ELC, then we assign probabilities to events as follows. For any event A,

P(A) = (the number of outcomes in A) / (the number of outcomes in S).
If we do not assume the ELC, there are two possibilities.
1. Suppose that the sample space is finite and consists of k possible outcomes: 1, 2, . . . , k. The probability of outcome j is denoted by p_j. Each p_j ≥ 0 and they sum to one.
2. Suppose that the sample space is infinite and consists of the sequence of possible outcomes: 0, 1, 2, . . . . The probability of outcome j is denoted by p_j. Each p_j ≥ 0 and they sum to one.
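Both cases impose the same two requirements on the p_j's: nonnegativity, and summing to one. A quick sketch, with a loaded four-outcome spinner and a halving assignment p_j = (1/2)^(j+1) as hypothetical examples (neither appears in the notes):

```python
# Hypothetical finite case: a loaded spinner with k = 4 outcomes.
p = [0.1, 0.2, 0.3, 0.4]
finite_ok = all(pj >= 0 for pj in p) and abs(sum(p) - 1) < 1e-12
print(finite_ok)  # True

# Hypothetical infinite case: p_j = (1/2)^(j+1) for j = 0, 1, 2, . . .
# The full infinite sum is exactly 1; a long partial sum comes very close.
partial = sum(0.5 ** (j + 1) for j in range(100))
print(partial)
```

The partial sum equals 1 − (1/2)^100, which is 1 to within floating-point precision.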
If A and B are events, then (A or B) is the event that contains all elements that are in A and/or
B; (AB) is the event that contains all elements that are in both A and B.
Two events, A and B, are called disjoint or mutually exclusive if they have no elements in
common; in other words, if AB is the empty set.
The Rules of Probability.
1. The probability of the sample space equals 1.
2. For any event A, 0 ≤ P(A) ≤ 1.
3. If A and B are disjoint events, then P(A or B) = P(A) + P(B).
4. P(Aᶜ) = 1 − P(A).
5. If A is a subset of B, then P(A) ≤ P(B).
6. For any events A and B, P(A or B) = P(A) + P(B) − P(AB).
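The rules can be checked mechanically on any small ELC example by counting outcomes with sets. A sketch, again using a hypothetical die roll (the specific events C and D below are illustrative assumptions, not from the notes):

```python
from fractions import Fraction

# Hypothetical ELC example: one roll of a fair six-sided die.
S = {1, 2, 3, 4, 5, 6}

def P(E):
    # ELC: probability is the ratio of counts (exact, via Fraction)
    return Fraction(len(E), len(S))

A, B = {1, 2, 3}, {3, 4}
print(P(A | B) == P(A) + P(B) - P(A & B))   # Rule 6: True
print(P(S - A) == 1 - P(A))                 # Rule 4 (complement): True

C, D = {1, 2}, {5, 6}                        # disjoint events
print(P(C | D) == P(C) + P(D))              # Rule 3: True
```

Here `|` is set union ("or"), `&` is intersection (AB), and `S - E` is the complement Eᶜ.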
Trials.
Consider repeated operations of a CM. Each operation is called a trial and yields the value of a random variable. The random variables are denoted by X₁ for the first trial, X₂ for the second trial, and so on. Trials are i.i.d. if, and only if, the Xᵢ's all have the same probability distribution (i.d.) and they are independent. The major consequence of independence is the multiplication rule. For example,

P(X₁ = 3, X₂ = 1, X₃ = 4) = P(X₁ = 3) P(X₂ = 1) P(X₃ = 4).
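The multiplication rule turns a joint probability into a product of single-trial probabilities. A sketch under the assumption that the trials are i.i.d. fair-die rolls (the fair die is a hypothetical choice, not stated in the notes):

```python
from fractions import Fraction

# Assume i.i.d. rolls of a fair six-sided die: P(X_i = x) = 1/6 for any face x.
p_single = Fraction(1, 6)

# Multiplication rule: P(X1 = 3, X2 = 1, X3 = 4) is the product of the three
# single-trial probabilities, by independence.
p_joint = p_single * p_single * p_single
print(p_joint)  # 1/216
```

Without independence, the joint probability need not factor this way, so the product formula is exactly where the independence assumption is used.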
 Fall '11
 hanlon