6 Joint Distributions
In the last two sections we saw distributions for a single unknown number. Next, we'll see distributions on two
or more unknown numbers.
Definition 37. Let X and Y be two discrete random variables. The joint pmf of X and Y is p_{X,Y}(x, y) = P(X = x, Y = y).
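In code, a joint pmf can be sketched as a dictionary keyed by (x, y) pairs, with marginals recovered by summing over the other variable; the particular pmf below is an illustrative assumption, not one from the notes.

```python
# A sketch of Definition 37: a joint pmf stored as a dictionary.
# The particular numbers are an illustrative assumption.
p_XY = {
    (0, 0): 0.1, (0, 1): 0.3,
    (1, 0): 0.2, (1, 1): 0.4,
}

def marginal_X(x):
    """p_X(x) = sum over y of p_{X,Y}(x, y)."""
    return sum(p for (xv, yv), p in p_XY.items() if xv == x)

print(marginal_X(0))       # 0.1 + 0.3, i.e. about 0.4
print(sum(p_XY.values()))  # a pmf sums to 1
```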
7.3 Classification of States
Definition 44 (Graphical representation of a Markov Chain). Let (X_n)_{n∈ℕ} be a Markov Chain with transition
matrix P. We represent (X_n)_{n∈ℕ} by a directed graph whose nodes are the states, with an arrow
from i to j whenever P_{i,j} > 0.
P(X_n = j | X_0 = i) = (P^n)_{i,j}. Observe that

P(X_{n+1} = j | X_0 = i)
  = Σ_k P(X_{n+1} = j | X_n = k, X_0 = i) P(X_n = k | X_0 = i)    [Law of Total Probability]
  = Σ_k P(X_{n+1} = j | X_n = k, X_0 = i) (P^n)_{i,k}             [Induction Hypothesis]
  = Σ_k P(X_{n+1} = j | X_n = k) (P^n)_{i,k}                      [Markovian Property]
  = Σ_k (P^n)_{i,k} P_{k,j} = (P^{n+1})_{i,j}.
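The identity P(X_n = j | X_0 = i) = (P^n)_{i,j} can be checked numerically by comparing the matrix power against a Monte Carlo simulation; the two-state transition matrix below is a hypothetical example, not one from the notes.

```python
# Checking P(X_n = j | X_0 = i) = (P^n)_{i,j} on a small hypothetical chain.
import random

random.seed(0)

P = [[0.9, 0.1],
     [0.5, 0.5]]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_pow(P, n):
    R = [[1.0, 0.0], [0.0, 1.0]]  # 2x2 identity
    for _ in range(n):
        R = mat_mul(R, P)
    return R

def simulate(i, n, trials=100_000):
    """Monte Carlo estimate of P(X_n = 1 | X_0 = i)."""
    hits = 0
    for _ in range(trials):
        state = i
        for _ in range(n):
            state = 0 if random.random() < P[state][0] else 1
        hits += (state == 1)
    return hits / trials

print(mat_pow(P, 3)[0][1])  # exact (P^3)_{0,1} = 0.156
print(simulate(0, 3))       # Monte Carlo estimate, close to the above
```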
Example 71. Consider the Markov Chain with graphical representation

[Figure: two states, 1 and 2, with an arrow from 1 to 2 and an arrow from 2 to 1.]

By definition, T_{1,1} = {(1, 2, 1), (1, 2, 1, 2, 1), (1, 2, 1, 2, 1, 2, 1), . . .}. Hence N_{1,1} = {2, 4, 6, . . .} and the period of
1 is 2. Similarly, the period of 2 is also 2.
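The period of a state i can be computed as gcd{n ≥ 1 : (P^n)_{i,i} > 0}. A minimal sketch for the chain of Example 71, with states renumbered 0 and 1 (the finite horizon of 20 steps is an arbitrary cut-off):

```python
# Period of state i: gcd of the step counts n at which return is possible.
# For the chain of Example 71 (each state jumps to the other), returns
# happen only after an even number of steps, so the period is 2.
from math import gcd

P = [[0.0, 1.0],
     [1.0, 0.0]]

def period(P, i, horizon=20):
    g = 0
    R = P  # R holds P^n
    for n in range(1, horizon + 1):
        if R[i][i] > 0:
            g = gcd(g, n)
        R = [[sum(R[a][k] * P[k][b] for k in range(len(P)))
              for b in range(len(P))] for a in range(len(P))]
    return g

print(period(P, 0))  # 2
print(period(P, 1))  # 2
```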
Definition 40. A stochastic process is a collection of random variables (X_t)_{t∈T}. One commonly thinks of t ∈ T
as a particular point in time and of X_t as the state of a given system at time t. Frequent choices for T are ℕ and ℝ,
corresponding to discrete and continuous time.
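As a concrete illustration of Definition 40, a simple ±1 random walk (our choice of example, not one from the notes) is a stochastic process with T = ℕ:

```python
# A simple random walk (X_n) on the integers: a stochastic process with
# discrete time T = N.  The walk itself is an illustrative assumption.
import random

random.seed(0)

def random_walk(n_steps):
    """Return the path X_0, X_1, ..., X_n of a +/-1 random walk."""
    path = [0]
    for _ in range(n_steps):
        path.append(path[-1] + random.choice([-1, 1]))
    return path

path = random_walk(10)
print(path)  # 11 states; consecutive states differ by exactly 1
```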
7 Markov Chains
Observe that

∫₀¹ ∫₀¹ f_{X,Y}(x, y) dy dx = ∫₀¹ ∫₀¹ c dy dx
  = ∫₀¹ [c y]₀¹ dx
  = ∫₀¹ c dx
  = [c x]₀¹
  = c.
Hence, c = 1. Observe that, if 0 < x < 1,

F_X(x) = P(X ≤ x)
  = P(X ≤ x, Y ∈ ℝ)
  = ∫_{-∞}^x ∫_{-∞}^{∞} f_{X,Y}(s, y) dy ds
  = ∫₀ˣ ∫₀¹ 1 dy ds
  = x.

Similarly, if x < 0, then F_X(x) = 0 and if x > 1, then F_X(x) = 1.
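The computation above can be sanity-checked with plain Riemann sums (the grid size N = 200 is an arbitrary choice):

```python
# Numerical check: for f_{X,Y}(x, y) = c on (0,1)^2, the double integral
# equals c (so c must be 1), and then F_X(x) = x on (0, 1).
N = 200
h = 1.0 / N
c = 1.0

# Riemann sum of f_{X,Y} over the unit square.
total = sum(c * h * h for _ in range(N) for _ in range(N))
print(total)  # close to 1.0 = c

def F_X(x, c=1.0):
    # Riemann sum of f_{X,Y} over (0, x) x (0, 1)
    return sum(c * h * h
               for i in range(int(x * N)) for j in range(N))

print(F_X(0.5))  # close to 0.5
```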
8 Poisson Process
Definition 54. Let (X_t)_{t∈ℝ} be a Stochastic Process. A point process is such that, for each t ∈ ℝ, X_t ∈ {0, 1}.
X_t = 1 denotes that an event happened at time t and X_t = 0 denotes that an event didn't happen at time t.
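Since Definition 55 is cut off in the text, the following only sketches the standard construction of a homogeneous Poisson point process on [0, T], via i.i.d. exponential inter-arrival gaps; the rate lam = 2.0 and horizon T = 10.0 are assumptions for illustration.

```python
# Standard homogeneous Poisson process construction: event times are
# partial sums of i.i.d. exponential gaps with rate lam (an assumption).
import random

random.seed(1)
lam, T = 2.0, 10.0

def event_times(lam, T):
    times, t = [], 0.0
    while True:
        t += random.expovariate(lam)  # exponential gap with rate lam
        if t > T:
            return times
        times.append(t)

events = event_times(lam, T)
print(len(events))                      # roughly lam * T = 20 on average
print(all(0 < s <= T for s in events))  # all events lie in (0, T]
```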
Definition 55. Let (X_t)_{t∈ℝ}
6.2 Univariate Transformations
Lemma 66. Let X be a discrete random variable and f be a function. If Y = f(X), then

p_Y(y) = Σ_{x : f(x) = y} p_X(x).
Proof.

p_Y(y) = P(f(X) = y)
  = Σ_{x∈Im[X]} P(f(X) = y | X = x) P(X = x)    [Law of Total Probability (Lemma 1)]
  = Σ_{x∈Im[X] : f(x) = y} P(X = x)
  = Σ_{x : f(x) = y} p_X(x).
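Lemma 66 translates directly into code: the pmf of Y = f(X) collects the mass of every x in the preimage {x : f(x) = y}. The pmf of X and the choice f(x) = x² below are assumptions for illustration.

```python
# Lemma 66 in code: p_Y(y) = sum of p_X(x) over {x : f(x) = y}.
# The pmf p_X and the function f are illustrative assumptions.
p_X = {-2: 0.2, -1: 0.3, 1: 0.3, 2: 0.2}

def f(x):
    return x * x

def p_Y(y):
    """Sum the mass of the preimage f^{-1}({y})."""
    return sum(p for x, p in p_X.items() if f(x) == y)

print(p_Y(1))  # 0.3 + 0.3: both x = -1 and x = 1 map to y = 1
print(p_Y(4))  # 0.2 + 0.2: both x = -2 and x = 2 map to y = 4
```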
7.5 Stationary Distribution and Invariant Measure
The previous classifications help us understand some of the qualitative properties of Markov Chains.
Definition 51 (Stationary Distribution). Let (X_n)_{n∈ℕ} be a Markov Chain with transition matrix P. For each
st
Lemma 77. If π is a reversible distribution for P, then π is an invariant measure for P.
Proof. Assume π is reversible. For every state x,

Σ_y π(y) P_{y,x} = Σ_y π(x) P_{x,y}    [since π is reversible]
  = π(x) Σ_y P_{x,y}
  = π(x) · 1    [rows of P sum to 1]
  = π(x).

Hence, πP = π and π is invariant.
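Lemma 77 can be verified numerically on a small chain: check detailed balance π(x) P_{x,y} = π(y) P_{y,x}, then confirm πP = π. Both P and π below are hypothetical examples, not from the notes.

```python
# Verifying Lemma 77 on a small hypothetical chain: pi satisfies
# detailed balance, and consequently pi P = pi (invariance).
P = [[0.5, 0.5, 0.0],
     [0.25, 0.5, 0.25],
     [0.0, 0.5, 0.5]]
pi = [0.25, 0.5, 0.25]

# Reversibility: pi(x) P_{x,y} == pi(y) P_{y,x} for all x, y.
reversible = all(abs(pi[x] * P[x][y] - pi[y] * P[y][x]) < 1e-12
                 for x in range(3) for y in range(3))
print(reversible)  # True

# Invariance: (pi P)(y) = sum_x pi(x) P_{x,y} equals pi(y).
piP = [sum(pi[x] * P[x][y] for x in range(3)) for y in range(3)]
print(piP)  # equal to pi
```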
Example 76. Co
Proof. If a > 0, the proof follows from Example 59. If a < 0,

F_Y(y) = P(Y ≤ y)
  = P(aX + b ≤ y)
  = P(X > (y − b)/a)
  = 1 − F_X((y − b)/a).

Using Lemma 50,

f_Y(y) = d/dy F_Y(y)
  = d/dy (1 − F_X((y − b)/a))
  = −(1/a) f_X((y − b)/a)
  = (1/|a|) f_X((y − b)/a).
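The resulting formula f_Y(y) = (1/|a|) f_X((y − b)/a) can be checked against a known case: taking X standard normal (an assumption for illustration) and a < 0, Y = aX + b is N(b, a²), whose density we can write down directly and compare.

```python
# Checking f_Y(y) = (1/|a|) f_X((y - b)/a) for Y = aX + b with a < 0,
# with X standard normal (an illustrative assumption), so Y ~ N(b, a^2).
import math

def phi(x):
    """Standard normal density f_X."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

a, b = -2.0, 1.0

def f_Y_formula(y):
    # the transformation formula from the proof above
    return phi((y - b) / a) / abs(a)

def f_Y_direct(y):
    # density of N(b, a^2) written out directly
    s = abs(a)
    return math.exp(-((y - b) ** 2) / (2 * s * s)) / (s * math.sqrt(2 * math.pi))

for y in (-3.0, 0.0, 2.5):
    print(abs(f_Y_formula(y) - f_Y_direct(y)) < 1e-12)  # True at each y
```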
6.3 Exercises
Exercise 133. Use Lemma 68 to prove the f