6 Joint Distributions
In the last two sections we saw distributions for a single unknown number. Next, we'll see distributions on two
or more unknown numbers.
Definition 37. Let X and Y be two discrete random variables. The joint pmf of X and Y is pX,Y (x, y) = P(X = x, Y = y).
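As a small illustration of Definition 37 (my own example, not from the notes): roll two fair dice, let X be the first die and Y the sum of both dice, and tabulate pX,Y (x, y) = P(X = x, Y = y) by enumerating the 36 equally likely outcomes.

```python
from fractions import Fraction
from itertools import product

# Hypothetical example: X = first die, Y = sum of two fair dice.
# The joint pmf assigns p_{X,Y}(x, y) = P(X = x, Y = y).
pmf = {}
for d1, d2 in product(range(1, 7), repeat=2):
    key = (d1, d1 + d2)                      # the pair (X, Y) for this outcome
    pmf[key] = pmf.get(key, Fraction(0)) + Fraction(1, 36)

print(pmf[(1, 7)])        # P(X = 1, Y = 7) = 1/36 (only outcome (1, 6))
print(sum(pmf.values()))  # total probability = 1
```

Note that each pair (x, y) occurs for at most one outcome here, so every nonzero joint probability is 1/36.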
7.3 Classification of States
Definition 44 (Graphical representation of a Markov Chain). Let (Xn )n∈N be a Markov Chain with transition
matrix P. We represent (Xn )n∈N by a directed graph such that the states are nodes and there is an arrow
from i to j whenever Pi,j > 0.
Recall that P (Xn = j|X0 = i) = (P n )i,j . Observe that

P (Xn+1 = j|X0 = i) = Σk P (Xn+1 = j|Xn = k, X0 = i) P (Xn = k|X0 = i)   (Law of Total Probability)
                    = Σk P (Xn+1 = j|Xn = k, X0 = i) (P n )i,k
                    = Σk P (Xn+1 = j|Xn = k) (P n )i,k                   (Markov property)
                    = Σk Pk,j (P n )i,k = (P n+1 )i,j .
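The identity above says the (n+1)-step transition matrix is the n-step matrix times P. A quick numeric check on a small chain (the matrix below is my own example, not from the notes):

```python
import numpy as np

# Verify P(X_{n+1} = j | X_0 = i) = sum_k (P^n)_{i,k} P_{k,j},
# i.e. P^{n+1} = P^n @ P, for an arbitrary 2-state chain.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

n = 4
lhs = np.linalg.matrix_power(P, n + 1)       # (n+1)-step transition probabilities
rhs = np.linalg.matrix_power(P, n) @ P       # law of total probability + Markov property
assert np.allclose(lhs, rhs)
print(lhs)
```

Rows of each power of P still sum to 1, since every P^n is itself a transition matrix.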
Example 71. Consider the Markov Chain whose graphical representation has two states, 1 and 2, with arrows 1 → 2 and 2 → 1.
By definition, T1,1 = {(1, 2, 1), (1, 2, 1, 2, 1), (1, 2, 1, 2, 1, 2, 1), . . .}. Hence N1,1 = {2, 4, 6, . . .} and the period of
1 is 2. Similarly, the period of 2 is also 2.
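The period of a state i is the gcd of the set of times n with (P n )i,i > 0. A short sketch that computes it directly from the arrows of the graph (the adjacency encoding and the cutoff max_n are my own choices):

```python
from math import gcd

# The chain from Example 71: two states that alternate, 1 -> 2 -> 1.
adj = {1: [2], 2: [1]}

def period(adj, state, max_n=20):
    """gcd of { n >= 1 : state can return to itself in exactly n steps },
    scanning return times up to max_n."""
    g = 0
    reachable = {state}                       # states reachable in exactly n steps
    for n in range(1, max_n + 1):
        reachable = {j for i in reachable for j in adj[i]}
        if state in reachable:
            g = gcd(g, n)                     # n is a possible return time
    return g

print(period(adj, 1))  # 2
print(period(adj, 2))  # 2
```

Here the only return times to state 1 are 2, 4, 6, . . ., so the gcd is 2, matching the example.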
Definition 40. A stochastic process is a collection of random variables (Xt )t∈T . One commonly thinks of t ∈ T
as a particular point in time and of Xt as the state of a given system at time t. Frequent choices for T are N and R,
corresponding to discrete and continuous time.
1 = ∫R ∫R fX,Y (x, y) dy dx = ∫0^1 ∫0^1 c dy dx = ∫0^1 [cy]y=0^1 dx = ∫0^1 c dx = [cx]x=0^1 = c.

Hence, c = 1. Observe that, if 0 < x < 1,

FX (x) = P (X ≤ x)
       = P (X ≤ x, Y ∈ R)
       = ∫−∞^x ∫R fX,Y (s, y) dy ds = ∫0^x ∫0^1 1 dy ds = x.

Similarly, if x < 0, then FX (x) = 0 and if x > 1, then FX (x) = 1.
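A Monte Carlo sanity check of this marginal CDF (my own sketch, assuming the reconstructed density f = 1 on the unit square): sample uniform points in (0, 1)^2 and estimate P (X ≤ x), which should be close to x.

```python
import random

# (X, Y) uniform on (0,1)^2, density c = 1; then F_X(x) = x for 0 < x < 1.
random.seed(0)
N = 100_000
samples = [(random.random(), random.random()) for _ in range(N)]

def F_X_estimate(x):
    # P(X <= x, Y in R): Y imposes no constraint on the unit square
    return sum(1 for s, _ in samples if s <= x) / N

for x in (0.25, 0.5, 0.9):
    print(x, F_X_estimate(x))
```

Each estimate should land within roughly 0.005 of x for this sample size.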
8 Poisson Process
Definition 54. Let (Xt )t∈R be a Stochastic Process. A point process is such that, for each t ∈ R, Xt ∈ {0, 1}.
Xt = 1 denotes that an event happened at time t and Xt = 0 denotes that an event didn't happen at time t.
Definition 55. Let (Xt )t∈R
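Definition 55 is cut off above; as a sketch, a homogeneous Poisson process with rate lam can be simulated by drawing i.i.d. Exponential(lam) gaps between successive events (this is the standard interarrival-time construction, not taken from the notes; the function name and parameters are my own):

```python
import random

def poisson_process_times(lam, horizon, rng):
    """Event times of a rate-lam Poisson process on (0, horizon]."""
    times, t = [], 0.0
    while True:
        t += rng.expovariate(lam)     # Exponential(lam) interarrival gap
        if t > horizon:
            return times
        times.append(t)

rng = random.Random(42)
events = poisson_process_times(lam=2.0, horizon=10.0, rng=rng)
print(len(events))   # on average about lam * horizon = 20 events
```

The number of events in (0, t] is then Poisson-distributed with mean lam * t, and counts over disjoint intervals are independent.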
6.2 Univariate Transformations
Lemma 66. Let X be a discrete random variable and f be a function. If Y = f (X), then

pY (y) = Σ{x : f (x) = y} pX (x).

Proof.

pY (y) = P (f (X) = y) = Σx P (f (X) = y|X = x) P (X = x)   (Law of Total Probability (Lemma 1))
       = Σ{x : f (x) = y} pX (x).
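Lemma 66 in code: sum pX (x) over the preimage of each y. The particular X and f below are my own example, not from the notes.

```python
from fractions import Fraction

# X uniform on {-2, -1, 0, 1, 2}, and Y = f(X) = X^2.
p_X = {x: Fraction(1, 5) for x in (-2, -1, 0, 1, 2)}
f = lambda x: x * x

# p_Y(y) = sum of p_X(x) over all x with f(x) = y
p_Y = {}
for x, p in p_X.items():
    p_Y[f(x)] = p_Y.get(f(x), Fraction(0)) + p

print(p_Y)   # p_Y(4) = 2/5, p_Y(1) = 2/5, p_Y(0) = 1/5
```

Note how ±2 both map to 4, so their probabilities add, exactly as the sum over the preimage prescribes.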
7.5 Stationary Distribution and Invariant Measure
The previous classifications help us understand some of the qualitative properties of Markov Chains.
Definition 51 (Stationary Distribution). Let (Xn )n∈N be a Markov Chain with transition matrix P. A distribution π is stationary for P if πP = π, that is, Σx π(x) Px,y = π(y) for every state y.
Lemma 77. If π is a reversible distribution for P, then π is an invariant measure for P.
Proof. Assume π is reversible, that is, π(x)Px,y = π(y)Py,x for all states x, y. For every state x,

(πP)(x) = Σy π(y)Py,x = Σy π(x)Px,y     since π is reversible
        = π(x) Σy Px,y = π(x) · 1        rows of P sum to 1.

Hence, πP = π and π is invariant.
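A numeric illustration of Lemma 77 (the 3-state chain and π below are my own example): check detailed balance π(x)Px,y = π(y)Py,x, then confirm πP = π.

```python
import numpy as np

# A reversible 3-state chain and its candidate distribution pi.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
pi = np.array([0.25, 0.50, 0.25])

# Detailed balance: the matrix pi(x) * P_{x,y} is symmetric.
flows = pi[:, None] * P
assert np.allclose(flows, flows.T)

# Lemma 77's conclusion: pi is invariant.
assert np.allclose(pi @ P, pi)
print(pi @ P)   # equals pi
```

The converse fails in general: invariant distributions need not satisfy detailed balance, so reversibility is a strictly stronger property.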
Example 76. Co
Proof. If a > 0, the proof follows from Example 59. If a < 0,

FY (y) = P (Y ≤ y)
       = P (aX + b ≤ y)
       = P (X ≥ (y − b)/a)
       = 1 − FX ((y − b)/a).

Using Lemma 50,

fY (y) = d/dy (1 − FX ((y − b)/a)) = −(1/a) fX ((y − b)/a) = fX ((y − b)/a)/|a|.
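A concrete check of this change-of-variables formula with a < 0 (my own numeric example): if X ~ N(0, 1) and Y = aX + b with a = −2, b = 1, then Y ~ N(1, 4), and fX ((y − b)/a)/|a| should equal the N(1, 4) density at every y.

```python
import math

def phi(z):
    """Standard normal pdf."""
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

a, b = -2.0, 1.0          # Y = aX + b with a < 0

def f_Y(y):
    # the formula from the proof: f_X((y - b)/a) / |a|
    return phi((y - b) / a) / abs(a)

def normal_pdf(y, mu, sigma):
    return phi((y - mu) / sigma) / sigma

# f_Y should match the N(1, 2^2) density pointwise.
for y in (-1.0, 0.0, 1.0, 3.0):
    assert abs(f_Y(y) - normal_pdf(y, 1.0, 2.0)) < 1e-12
print(f_Y(1.0))   # density at the mode of N(1, 4)
```

The |a| in the denominator is what makes the a > 0 and a < 0 cases collapse into the single formula fY (y) = fX ((y − b)/a)/|a|.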
Exercise 133. Use Lemma 68 to prove the f