6 Joint Distributions
In the last two sections we saw distributions for a single unknown number. Next, we'll see distributions on two or more unknown numbers.
Definition 37. Let X and Y be two discrete random variables. The joint probability mass function of X and Y is p_{X,Y}(x, y) = P(X = x, Y = y).
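As a concrete illustration of a joint pmf (the dice example below is ours, not the text's), the joint distribution of two discrete random variables can be tabulated by enumerating outcomes:

```python
from itertools import product
from fractions import Fraction

# Joint pmf of X = value of the first die and Y = max of two fair dice,
# built by enumerating all 36 equally likely outcomes.
joint = {}
for d1, d2 in product(range(1, 7), repeat=2):
    key = (d1, max(d1, d2))
    joint[key] = joint.get(key, Fraction(0)) + Fraction(1, 36)

# Summing the joint pmf over y recovers the marginal pmf of X,
# which is uniform on {1, ..., 6}.
p_X = {x: sum(p for (x2, _), p in joint.items() if x2 == x)
       for x in range(1, 7)}

print(joint[(3, 3)])  # P(X = 3, Y = 3) = 3/36 = 1/12
print(p_X[3])         # 1/6
```

Exact `Fraction` arithmetic keeps the probabilities free of rounding error, so the marginals can be checked exactly.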
7.3 Classification of States
Definition 44 (Graphical representation of a Markov Chain). Let (X_n)_{n∈N} be a Markov Chain on a state space with transition matrix P. We represent (X_n)_{n∈N} by a directed graph such that there is an edge from state i to state j whenever P_{i,j} > 0.
P(X_n = j | X_0 = i) = (P^n)_{i,j}. Observe that

P(X_{n+1} = j | X_0 = i) = Σ_k P(X_{n+1} = j | X_n = k, X_0 = i) P(X_n = k | X_0 = i)    (Law of Total Probability)
= Σ_k P(X_{n+1} = j | X_n = k, X_0 = i) (P^n)_{i,k}    (Induction Hypothesis)
= Σ_k P_{k,j} (P^n)_{i,k}    (Markov property)
= (P^{n+1})_{i,j}.
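The identity P(X_n = j | X_0 = i) = (P^n)_{i,j} can be checked numerically by repeated matrix multiplication. A minimal sketch follows; the two-state transition matrix is our own example, not one from the text.

```python
# n-step transition probabilities via matrix powers: (P^n)_{i,j}.
def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    # Start from the identity matrix and multiply by P n times.
    result = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        result = mat_mul(result, P)
    return result

# A small two-state chain (our own example).
P = [[0.9, 0.1],
     [0.5, 0.5]]

P5 = mat_pow(P, 5)
# Each row of P^n is still a probability distribution.
print(sum(P5[0]))  # 1.0 (up to floating point)
```

As a consistency check, P^5 should equal P^2 · P^3, which is exactly the Chapman–Kolmogorov decomposition used in the induction step above.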
Definition 40. A stochastic process is a collection of random variables (X_t)_{t∈T}. One commonly thinks of t ∈ T as a particular point in time and X_t as the state of a given system at time t. Frequent choices are T = N (discrete time) and T = [0, ∞) (continuous time).
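As a concrete instance of Definition 40 with T = N, the simple random walk below records the state X_n at each time n; the walk itself is our own illustration, not an example from the text.

```python
import random

# The simple random walk: X_0 = 0 and each step adds +1 or -1
# with equal probability, giving a stochastic process (X_n)_{n in N}.
def random_walk(n_steps, seed=0):
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(n_steps):
        x += rng.choice([-1, 1])
        path.append(x)
    return path

path = random_walk(10)
print(len(path))  # 11 states: X_0 through X_10
```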
8 Poisson Process
Definition 54. Let (X_t)_{t∈R} be a Stochastic Process. A point process is such that, for each t ∈ R, X_t ∈ {0, 1}. X_t = 1 denotes that an event happened at time t and X_t = 0 denotes that no event happened at time t.
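A Poisson process can be simulated by the standard construction with i.i.d. exponential interarrival times. The sketch below assumes that construction; the rate and time horizon are our own choices, not values from the text.

```python
import random

# Sketch of a Poisson point process on [0, horizon] with rate lam,
# built from i.i.d. Exponential(lam) interarrival times.
def poisson_events(lam, horizon, seed=1):
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(lam)  # waiting time until the next event
        if t > horizon:
            return events
        events.append(t)

events = poisson_events(lam=2.0, horizon=10.0)
# The expected number of events is lam * horizon = 20.
print(len(events))
```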
6.2 Univariate Transformations
Lemma 66. Let X be a discrete random variable and f be a function. If Y = f(X), then

p_Y(y) = Σ_{x : f(x) = y} p_X(x).

Proof.

p_Y(y) = P(f(X) = y)
= Σ_x P(f(X) = y | X = x) P(X = x)    (Law of Total Probability)
= Σ_{x : f(x) = y} P(X = x)
= Σ_{x : f(x) = y} p_X(x).
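Lemma 66 translates directly into code: the pmf of Y = f(X) is obtained by summing p_X over each preimage {x : f(x) = y}. The pmf and the choice f(x) = x² below are our own illustration.

```python
from collections import defaultdict
from fractions import Fraction

# Push a pmf p_X through f by summing p_X(x) over {x : f(x) = y}.
def pushforward(p_X, f):
    p_Y = defaultdict(Fraction)
    for x, p in p_X.items():
        p_Y[f(x)] += p
    return dict(p_Y)

# X uniform on {-2, -1, 0, 1, 2}, Y = X^2 (our own example).
p_X = {x: Fraction(1, 5) for x in range(-2, 3)}
p_Y = pushforward(p_X, lambda x: x ** 2)

print(p_Y[4])  # p_X(-2) + p_X(2) = 2/5
print(p_Y[0])  # p_X(0) = 1/5
```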
7.5 Stationary Distribution and Invariant Measure
The previous classifications help us understand some of the qualitative properties of Markov Chains.
Definition 51 (Stationary Distribution). Let (X_n)_{n∈N} be a Markov Chain with transition matrix P. A distribution π is stationary for P if π P = π, that is, Σ_y π(y) P_{y,x} = π(x) for every state x.
Lemma 77. If π is a reversible distribution for P, then π is an invariant measure for P.
Proof. Assume π is reversible. For every state x,

Σ_y π(y) P_{y,x} = Σ_y π(x) P_{x,y}    (since π is reversible)
= π(x) Σ_y P_{x,y}
= π(x) · 1    (rows of P sum to 1)
= π(x).
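Lemma 77 can be verified numerically. The sketch below uses the random walk on a small undirected graph, for which π(x) proportional to deg(x) is a standard reversible distribution; the graph itself is our own example.

```python
# Random walk on a triangle with a tail: from x, move to a uniformly
# chosen neighbour. pi(x) = deg(x) / (2 * #edges) is reversible for P.
edges = [(0, 1), (1, 2), (2, 0), (2, 3)]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

states = sorted(adj)
deg = {x: len(adj[x]) for x in states}
total = sum(deg.values())  # equals 2 * number of edges
pi = {x: deg[x] / total for x in states}
P = {x: {y: (1 / deg[x] if y in adj[x] else 0.0) for y in states}
     for x in states}

# Detailed balance: pi(x) P[x][y] == pi(y) P[y][x] for all x, y.
assert all(abs(pi[x] * P[x][y] - pi[y] * P[y][x]) < 1e-12
           for x in states for y in states)

# Invariance, as in Lemma 77: sum_y pi(y) P[y][x] == pi(x).
piP = {x: sum(pi[y] * P[y][x] for y in states) for x in states}
print(all(abs(piP[x] - pi[x]) < 1e-12 for x in states))  # True
```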
Proof. If a > 0, the proof follows from Example 59. If a < 0,

F_Y(y) = P(Y ≤ y)
= P(aX + b ≤ y)
= P(X > (y − b)/a)
= 1 − F_X((y − b)/a).

Using Lemma 50,

f_Y(y) = ∂/∂y F_Y(y)
= ∂/∂y (1 − F_X((y − b)/a))
= −(1/a) f_X((y − b)/a)
= (1/|a|) f_X((y − b)/a),

since a < 0 implies −1/a = 1/|a|.
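The case a < 0 can be sanity-checked by Monte Carlo. With X ~ Uniform(0, 1), a = −2 and b = 1 (our own choices, not the text's), the formula f_Y(y) = (1/|a|) f_X((y − b)/a) predicts f_Y ≡ 1/2 on (−1, 1):

```python
import random

# Monte Carlo check of f_Y(y) = (1/|a|) f_X((y - b)/a) for a < 0.
# X ~ Uniform(0, 1), so f_X = 1 on (0, 1) and f_Y should be ~0.5
# everywhere on (-1, 1).
rng = random.Random(0)
a, b = -2.0, 1.0
samples = [a * rng.random() + b for _ in range(100_000)]

# Empirical density on the interval (-0.5, 0.5), which has length 1:
# fraction of samples landing there divided by the interval length.
hits = sum(-0.5 < y < 0.5 for y in samples)
density_estimate = hits / len(samples) / 1.0
print(round(density_estimate, 2))
```

With 100,000 samples the estimate should fall within a couple of percentage points of the predicted value 0.5.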