1 Introduction to random variables
Example 1.1:
The experiment is to toss a fair coin 2 times. Recall the sample space,
S = {HH, HT, TH, TT},
which we will assume has equally-likely outcomes (i.e., each has probability 1/4). Let
X = the number of “heads” we will see if we perform this experiment.
• X can take on values: 0, 1, or 2.
• For example, the event X = 1 has the outcomes {HT, TH}.
• We can compute the probabilities of the events X = 0, X = 1, X = 2 using the equally-likely rule:
  P(X = 0) = P({TT}) = 1/4
  P(X = 1) = P({HT, TH}) = 2/4 = 1/2
  P(X = 2) = P({HH}) = 1/4
• We can compute the probabilities of other events, for example:
  P(X ≤ 1) = P({HT, TH, TT}) = 3/4
• We call X a random variable.
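The equally-likely computations above can be reproduced by direct enumeration. The sketch below (names like `sample_space` and `pmf` are illustrative, not from the notes) lists all four outcomes, counts heads in each, and applies the equally-likely rule.

```python
from itertools import product
from collections import Counter
from fractions import Fraction

# Sample space of two fair coin tosses: ('H','H'), ('H','T'), ('T','H'), ('T','T').
sample_space = list(product("HT", repeat=2))

# X = number of heads in each outcome; count how many outcomes give each value.
counts = Counter(outcome.count("H") for outcome in sample_space)

# Equally-likely rule: P(X = k) = (# outcomes with k heads) / (# outcomes).
pmf = {k: Fraction(n, len(sample_space)) for k, n in sorted(counts.items())}

print(pmf)                 # {0: 1/4, 1: 1/2, 2: 1/4}
print(pmf[0] + pmf[1])     # P(X <= 1) = 3/4
```

Using `Fraction` keeps the probabilities exact, matching the 1/4, 1/2, 3/4 values computed by hand.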
1.1 Definitions
Recall the definition of an experiment from before:
Definition: experiment
action or process that generates outcomes. Only one outcome can occur and we are usually
uncertain which outcome this will be.
Definition: random variable
a numerical measurement of the outcome of an experiment that has yet to be performed.
We will study discrete random variables and continuous random variables.
• discrete: the set of possible values has a finite or countable number of elements/values (e.g., {0, 1}, {-1.5, 0, 1.5, 2, 3}, {1, 2, 3, . . .}).
• continuous: the set of possible values has an uncountably infinite number of elements/values, such as intervals (e.g., [0, 1], (-∞, ∞)). Also, P(X = c) = 0 for any possible value of c.