8. Discrete Multivariate Distributions

8.1 Basic Terminology and Techniques
Many problems involve more than a single random variable. When there are multiple random variables associated with an experiment or process we usually denote them as $X, Y, \dots$ or as $X_1, X_2, \dots$. For example, your final mark in a course might involve $X_1$ = your assignment mark, $X_2$ = your midterm test mark, and $X_3$ = your exam mark. We need to extend the ideas introduced for single variables to deal with multivariate problems. Here we only consider discrete multivariate problems, though continuous multivariate variables are also common in daily life (e.g. consider a person's height $X$ and weight $Y$, or $X_1$ = the return from Stock 1, $X_2$ = the return from Stock 2). To introduce the ideas in a simple setting, we'll first consider an example in which there are only a few possible values of the variables. Later we'll apply these concepts to more complex examples. The ideas themselves are simple even though some applications can involve fairly messy algebra.
Joint Probability Functions: First, suppose there are two r.v.'s $X$ and $Y$, and define the function

$$f(x, y) = P(X = x \text{ and } Y = y) = P(X = x, Y = y).$$

We call $f(x, y)$ the joint probability function of $(X, Y)$. In general,

$$f(x_1, x_2, \dots, x_n) = P(X_1 = x_1 \text{ and } X_2 = x_2 \text{ and } \dots \text{ and } X_n = x_n)$$

if there are $n$ random variables $X_1, \dots, X_n$.
The properties of a joint probability function are similar to those for a single variable; for two r.v.'s we have $f(x, y) \geq 0$ for all $(x, y)$ and

$$\sum_{\text{all } (x, y)} f(x, y) = 1.$$
Example: Consider the following numerical example, where we show $f(x, y)$ in a table.

                     x
    f(x, y)    0     1     2
    y    1    .1    .2    .3
         2    .2    .1    .1

For example, $f(0, 2) = P(X = 0 \text{ and } Y = 2) = 0.2$. We can check that $f(x, y)$ is a proper joint probability function since $f(x, y) \geq 0$ for all 6 combinations of $(x, y)$ and the sum of these 6 probabilities is 1. When there are only a few values for $X$ and $Y$ it is often easier to tabulate $f(x, y)$ than to find a formula for it. We'll use this example below to illustrate other definitions for multivariate distributions, but first we give a short example where we need to find $f(x, y)$.
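The two defining properties of a joint probability function can be checked mechanically. A minimal sketch, encoding the table above as a Python dictionary keyed by $(x, y)$ pairs (the dictionary representation is simply one convenient choice, not part of the notes):

```python
# Joint probability function f(x, y) from the table above,
# stored as a dictionary keyed by (x, y) pairs.
f = {
    (0, 1): 0.1, (1, 1): 0.2, (2, 1): 0.3,
    (0, 2): 0.2, (1, 2): 0.1, (2, 2): 0.1,
}

# Property 1: f(x, y) >= 0 for every (x, y).
assert all(p >= 0 for p in f.values())

# Property 2: the probabilities over all 6 combinations sum to 1.
assert abs(sum(f.values()) - 1.0) < 1e-9

# For example, f(0, 2) = P(X = 0 and Y = 2) = 0.2.
print(f[(0, 2)])  # 0.2
```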
Example: Suppose a fair coin is tossed 3 times. Define the r.v.'s $X$ = number of heads and $Y = 1$ if $H$ occurs on the first toss, $Y = 0$ if $T$ occurs. Find the joint probability function for $(X, Y)$.
Solution: First we should note the range for $(X, Y)$, which is the set of possible values $(x, y)$ which can occur. Clearly $X$ can be 0, 1, 2, or 3 and $Y$ can be 0 or 1, but we'll see that not all 8 combinations $(x, y)$ are possible.
We can find $f(x, y) = P(X = x, Y = y)$ by just writing down the sample space

$$S = \{HHH, HHT, HTH, THH, HTT, THT, TTH, TTT\}$$

that we have used before for this process. Then simple counting gives $f(x, y)$ as shown in the following table:
                        x
    f(x, y)    0     1     2     3
    y    0    1/8   2/8   1/8    0
         1     0    1/8   2/8   1/8
For example, $(X, Y) = (0, 0)$ if and only if the outcome is $TTT$; $(X, Y) = (1, 0)$ iff the outcome is either $THT$ or $TTH$.
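The same counting argument can be mirrored in a short script that enumerates the 8 equally likely outcomes, each contributing probability 1/8 (a sketch of the counting, not part of the original notes):

```python
from itertools import product
from collections import defaultdict

# Enumerate the 8 equally likely outcomes of 3 fair coin tosses.
f = defaultdict(float)
for outcome in product("HT", repeat=3):
    x = outcome.count("H")              # X = number of heads
    y = 1 if outcome[0] == "H" else 0   # Y = 1 iff H occurs on the first toss
    f[(x, y)] += 1 / 8                  # each outcome contributes 1/8

# Matches the table: e.g. f(1, 0) = 2/8 from THT and TTH,
# and f(0, 1) = 0 since X = 0 forces a T on the first toss.
print(f[(1, 0)])  # 0.25
print(f[(0, 1)])  # 0.0
```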