Chapter 5
Multivariate Probability Distributions
Abstract:
In this chapter we study the relationships among random variables, as characterized by their joint probability distribution. Most insights into multivariate distributions can be gained by focusing on bivariate distributions. We first introduce the joint probability distribution of a bivariate random vector $(X, Y)$ via the joint cumulative distribution function, the joint probability mass function (when $(X, Y)$ is discrete), and the joint probability density function (when $(X, Y)$ is continuous). We then characterize various aspects of the relationship between $X$ and $Y$ using conditional distributions, correlation, and conditional expectations. The concept of independence and its implications for joint distributions, conditional distributions, and correlation is also discussed. Finally, we introduce the class of bivariate normal distributions and examine its important properties.
Key words:
Bivariate normal distribution, Bivariate transformation, Conditional distribution, Conditional mean, Conditional variance, Correlation, Independence, Joint moment generating function, Joint probability distribution, Law of iterated expectations, Marginal distribution.
5.1 Random Vectors and Joint Probability Distributions
Any economy is a system consisting of different units, and these units are generally related to each other. As a consequence, economic variables are interrelated. The most important goal of economic analysis and econometric analysis is to identify the relationships between economic events or economic variables. As discussed in Chapter 2, given events $A$ and $B$, their joint probability $P(A \cap B)$ describes a predictive relationship. Such a relationship can be exploited to predict one event using the other.
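As a minimal numerical sketch of this idea (the die-roll sample space and the particular events are illustrative, not from the text), the joint probability $P(A \cap B)$ can be combined with $P(B)$ to form the conditional probability $P(A \mid B) = P(A \cap B)/P(B)$, which is what one uses to predict $A$ after observing $B$:

```python
from fractions import Fraction

# Illustrative sample space: one roll of a fair die.
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}        # event "outcome is even"
B = {4, 5, 6}        # event "outcome is at least 4"

def prob(event):
    # Outcomes are equally likely, so P(E) = |E| / |S|.
    return Fraction(len(event), len(S))

p_joint = prob(A & B)       # P(A ∩ B) = P({4, 6}) = 1/3
p_cond = p_joint / prob(B)  # P(A | B) = P(A ∩ B) / P(B) = 2/3

print(p_joint, p_cond)      # → 1/3 2/3
```

Observing $B$ raises the probability of $A$ from $1/2$ to $2/3$, which is the predictive content of the joint probability.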
Definition 5.1. [Random Vector]: An $n$-dimensional random vector, denoted as $Z = (Z_1, Z_2, \ldots, Z_n)'$, is a function from a sample space $S$ into $\mathbb{R}^n$, the $n$-dimensional Euclidean space.
For each outcome $s \in S$, $Z(s)$ is an $n$-dimensional real-valued vector and is called a realization of the random vector $Z$.
In this chapter, we will mainly focus on bivariate probability distributions, which can illustrate most (but not all) essentials of multivariate probability distributions. In most of the subsequent discussion we consider two random variables $(X, Y)$, where both $X$ and $Y$ are defined on the same probability space $(S, \mathcal{B}, P)$. A realization of $(X, Y)$ is a pair $(x, y) \in \mathbb{R}^2$.
Having defined a bivariate random vector $(X, Y)$, we can now discuss probabilities of events that are defined in terms of $(X, Y)$. How can we characterize the joint probability distribution of $X$ and $Y$?
As in the univariate case, we can use the CDF, now called the joint CDF of $X$ and $Y$, to characterize their joint distribution.
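As a rough illustration of what the joint CDF $F(x, y) = P(X \le x, Y \le y)$ measures, the following Monte Carlo sketch approximates it by the fraction of simulated draws falling in the region $\{X \le x, Y \le y\}$ (the choice of two independent fair dice for $(X, Y)$ is an assumption made only for this example):

```python
import random

random.seed(0)

# Simulate draws of (X, Y): two independent fair dice (illustrative only).
n = 100_000
draws = [(random.randint(1, 6), random.randint(1, 6)) for _ in range(n)]

def joint_cdf(x, y):
    # Empirical estimate of F(x, y) = P(X <= x, Y <= y).
    return sum(1 for (u, v) in draws if u <= x and v <= y) / n

# Under independence the exact value is P(X <= 3) * P(Y <= 2) = (3/6)*(2/6) = 1/6.
print(joint_cdf(3, 2))
```

The estimate should be close to $1/6 \approx 0.1667$, and $F(6, 6) = 1$ since every draw satisfies $X \le 6$ and $Y \le 6$.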