ECEN 303: Random Signals and Systems
Lecture 8: Jointly Distributed Random Variables
1 Joint Distribution Functions
Thus far, we have only concerned ourselves with probability distributions for single random
variables. However, we are often interested in probability statements concerning two or more
random variables. In order to deal with such probabilities, we define, for any two random
variables $X$ and $Y$, the joint CDF of $X$ and $Y$ by
\[
F(x, y) = \Pr(X \le x, Y \le y).
\]
The CDFs of $X$ and $Y$ can be obtained from the joint CDF of $X$ and $Y$ as follows:
\[
F_X(x) = \Pr(X \le x) = \Pr(X \le x, Y < \infty) = F(x, \infty)
\]
and
\[
F_Y(y) = \Pr(Y \le y) = \Pr(X < \infty, Y \le y) = F(\infty, y).
\]
The distribution functions $F_X(\cdot)$ and $F_Y(\cdot)$ are usually referred to as the marginal
CDFs of $X$ and $Y$, respectively.
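As a numerical sketch of taking marginals, consider a hypothetical joint CDF built from two independent exponential random variables (rates 1 and 2); this particular $F$ is an assumption chosen only for illustration. Letting the other argument grow without bound recovers each marginal CDF:

```python
import math

# Hypothetical joint CDF for illustration: X ~ Exponential(1) and
# Y ~ Exponential(2), assumed independent, so F(x, y) factors.
def F(x, y):
    return (1 - math.exp(-x)) * (1 - math.exp(-2 * y))

# Numerically, a very large value stands in for infinity.
LARGE = 1e6

def F_X(x):
    return F(x, LARGE)   # F(x, infinity)

def F_Y(y):
    return F(LARGE, y)   # F(infinity, y)

print(F_X(1.0))  # approximately 1 - e^{-1}
print(F_Y(1.0))  # approximately 1 - e^{-2}
```

Because this $F$ factors, the marginals come out exactly as the two one-dimensional exponential CDFs; for a general joint CDF, only the limit relationship $F_X(x) = F(x, \infty)$ is guaranteed.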
All joint probability statements about $X$ and $Y$ can, in theory, be answered in terms of their
joint CDF. For example, for any $x_1 < x_2$ and $y_1 < y_2$,
\[
\begin{aligned}
\Pr(x_1 < X \le x_2,\, y_1 < Y \le y_2)
&= \Pr(X \le x_2, Y \le y_2) - \Pr(X \le x_2, Y \le y_1) \\
&\quad - \Pr(X \le x_1, Y \le y_2) + \Pr(X \le x_1, Y \le y_1) \\
&= F(x_2, y_2) - F(x_2, y_1) - F(x_1, y_2) + F(x_1, y_1).
\end{aligned}
\]
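The rectangle formula can be checked numerically. The sketch below assumes the same hypothetical independent-exponential joint CDF used for illustration (not part of the notes); in that independent case, the rectangle probability must equal the product of the two one-dimensional interval probabilities:

```python
import math

# Hypothetical joint CDF (independent exponentials, rates 1 and 2),
# used only to illustrate the inclusion-exclusion rectangle formula.
def F(x, y):
    return (1 - math.exp(-x)) * (1 - math.exp(-2 * y))

def rect_prob(x1, x2, y1, y2):
    # Inclusion-exclusion over the four corners of the rectangle.
    return F(x2, y2) - F(x2, y1) - F(x1, y2) + F(x1, y1)

p = rect_prob(1.0, 2.0, 0.5, 1.5)

# Because this F factors, p equals the product of the marginal
# interval probabilities Pr(1 < X <= 2) * Pr(0.5 < Y <= 1.5).
px = math.exp(-1) - math.exp(-2)
py = math.exp(-1) - math.exp(-3)
print(p, px * py)
```

The agreement of the two printed values confirms the corner signs $(+,-,-,+)$ in the formula above.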
In the case when both $X$ and $Y$ are discrete random variables, it is convenient to define the
joint PMF of $X$ and $Y$ by
\[
p(x, y) = \Pr(X = x, Y = y).
\]
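For discrete variables, a joint PMF is just a table of probabilities over $(x, y)$ pairs, and marginal PMFs follow by summing over the other variable, mirroring the marginal CDFs above. A minimal sketch, using a small made-up joint PMF (the numbers are assumptions for illustration):

```python
# Hypothetical joint PMF over two binary random variables, stored as a
# dictionary mapping (x, y) pairs to probabilities; entries sum to 1.
joint_pmf = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# Marginal PMFs: sum the joint PMF over the other variable.
p_X = {}
p_Y = {}
for (x, y), p in joint_pmf.items():
    p_X[x] = p_X.get(x, 0.0) + p
    p_Y[y] = p_Y.get(y, 0.0) + p

print(p_X)  # Pr(X = 0) = 0.3, Pr(X = 1) = 0.7
print(p_Y)  # Pr(Y = 0) = 0.4, Pr(Y = 1) = 0.6
```

Each marginal again sums to 1, as any PMF must.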