Markov Chains: Introduction
This shows that all finite-dimensional probabilities are specified once the transition
probabilities and initial distribution are given, and in this sense, the process is defined
by these quantities.
Related computations show ...
Solutions to DIY Exercises of Chapter 5
Exercise 5.1.
Use the moment generating function to derive the mean and variance of a Poisson
random variable.
Solution. Suppose $X \sim \mathcal{P}(\lambda)$. Its moment generating function is
$$\phi(t) = E(e^{tX}) = \sum_{n=0}^{\infty} e^{tn}\,\frac{\lambda^n e^{-\lambda}}{n!} = e^{\lambda(e^t - 1)}.$$
Then, $E(X) = \phi'(0) = \lambda$ and $E(X^2) = \phi''(0) = \lambda + \lambda^2$, so $\mathrm{Var}(X) = (\lambda + \lambda^2) - \lambda^2 = \lambda$.
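The two derivatives of the MGF at $t = 0$ can be checked numerically with finite differences; a minimal sketch (the rate $\lambda = 2$ is an arbitrary illustrative choice):

```python
import math

lam = 2.0  # arbitrary illustrative rate
phi = lambda t: math.exp(lam * (math.exp(t) - 1.0))  # Poisson MGF

h = 1e-5
d1 = (phi(h) - phi(-h)) / (2 * h)              # ~ phi'(0)  = lam
d2 = (phi(h) - 2 * phi(0.0) + phi(-h)) / h**2  # ~ phi''(0) = lam + lam^2
mean = d1
var = d2 - d1**2
print(mean, var)   # both close to lam = 2
```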
Solutions to DIY Exercises of Chapter 6
Exercise 6.1
Does Postulate 1 for the Poisson process imply independent increments?
Solution. Yes, but it must be coupled with the requirement that the process be a Markov chain. Note that, with the Markov property, Postulate 1 implies ...
Solutions to DIY Exercises of Chapters 3 and 4.
Exercise 3.1. Bunny rabbit has three dens A, B and C. It likes A better than B and C. If it is in B or C on any night, it will always move to A with chance 0.9 and to the other den with chance 0.1 for the following night ...
Sample Exam of Math341, Stochastic Modeling
1. Let $N(t)$ be a Poisson process with rate $\lambda > 0$. Let $W_m = \min\{t : N(t) = m\}$. Compute the conditional probability $P(W_1 > 1 \mid W_2 > 2)$.
Solution. The key step is to translate waiting times into counts: $\{W_1 > 1\} = \{N(1) = 0\}$ and $\{W_2 > 2\} = \{N(2) \le 1\}$. Then, using independent increments,
$$P(W_1 > 1 \mid W_2 > 2) = P(N(1) = 0 \mid N(2) \le 1) = \frac{P(N(1) = 0,\; N(2) - N(1) \le 1)}{P(N(2) \le 1)} = \frac{e^{-\lambda}\,(e^{-\lambda} + \lambda e^{-\lambda})}{e^{-2\lambda}(1 + 2\lambda)} = \frac{1 + \lambda}{1 + 2\lambda}.$$
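The answer $(1+\lambda)/(1+2\lambda)$ is easy to check by simulation; a minimal Monte Carlo sketch ($\lambda = 1$ is an arbitrary choice, and $W_1, W_2$ are built from exponential inter-arrival times):

```python
import random

random.seed(0)
lam = 1.0          # assumed rate; the closed form is (1 + lam) / (1 + 2*lam)
n = 200_000

hit_cond, hit_both = 0, 0
for _ in range(n):
    w1 = random.expovariate(lam)       # W1 ~ Exp(lam)
    w2 = w1 + random.expovariate(lam)  # W2 = W1 + independent Exp(lam)
    if w2 > 2:
        hit_cond += 1
        if w1 > 1:
            hit_both += 1

estimate = hit_both / hit_cond
exact = (1 + lam) / (1 + 2 * lam)      # = 2/3 for lam = 1
print(estimate, exact)
```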
Solutions to Problems/Exercises of Chapter 4 (Homework # 4).
Chapter 4. The Long Run Behavior of Markov Chains.
Must-do problems:
Exercise 1.3. Let $\pi_j$ be the long-run fraction of time spent in state $j$. Notice that $P$ is regular. Solve the equations $(\pi_0, \pi_1, \ldots)P = (\pi_0, \pi_1, \ldots)$ together with $\sum_j \pi_j = 1$ ...
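The exercise's own transition matrix is cut off in these notes, so as an illustrative sketch we solve $\pi P = \pi$, $\pi_0 + \pi_1 = 1$ for the two-state matrix of Example 4.1 (with second row $(0.75, 0.25)$), where the balance equation gives a closed form:

```python
# Stationary distribution of a regular two-state chain via pi P = pi and
# pi0 + pi1 = 1.  Matrix borrowed from Example 4.1 of these notes for
# illustration only; the exercise's actual P is truncated in the source.
P = [[0.33, 0.67],
     [0.75, 0.25]]

# For two states the balance equation pi0 * p01 = pi1 * p10 gives:
pi0 = P[1][0] / (P[0][1] + P[1][0])
pi1 = 1.0 - pi0
pi = [pi0, pi1]

# Verify pi P = pi.
check = [pi[0] * P[0][j] + pi[1] * P[1][j] for j in (0, 1)]
print(pi, check)
```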
Stochastic Modeling
MATH 3425
Spring 2014, HKUST
Instructor & T.A.
Instructor: Kani Chen. Email: makchen@ust.hk; Phone: 2358-7425; Office: Room 3477. Office hour: Walk-in or by appointment.
TA:
Textbook & Reference Books.
Lecture Notes: To be posted every t...
Review of Chapters 5 and 6
Chapter 5. Poisson Processes
1. Definitions of the Poisson distribution. (1) Probability function (a scaled Taylor expansion of the exponential function). (2) Mean and variance.
2. Binomial approximation (law of rare events): Suppose ...
Solutions to DIY Exercises of Chapter 7
Exercise 7.1
Verify that $N(t) + k$ for any $k \ge 1$ is a stopping time, but $N(t)$ is not.
Proof. For $j \ge k$,
$$\{N(t) + k = j\} = \{N(t) = j - k\} = \{W_{j-k} \le t,\; W_{j-k+1} > t\},$$
which depends only on $X_1, \ldots, X_{j-k+1}$, and therefore ...
Chapter 4. The Long Run Behavior of Markov Chains
In the long run, we are all equal. -with apology to John Maynard Keynes
4.1. Regular Markov chains.
Example 4.1. Let $\{X_n\}$ be a Markov chain with two states 0 and 1, and transition matrix
$$P = \begin{pmatrix} 0.33 & 0.67 \\ 0.75 & 0.25 \end{pmatrix}.$$
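For a regular chain such as this one, $P^n$ converges to a matrix with identical rows, the limiting distribution. A minimal numerical sketch (the second row of $P$ is taken as $(0.75, 0.25)$ so that it sums to 1):

```python
# Powers of a regular two-state transition matrix converge to a matrix
# whose identical rows give the limiting distribution.
P = [[0.33, 0.67],
     [0.75, 0.25]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

Pn = P
for _ in range(50):     # P^51; rows settle fast since |second eigenvalue| = 0.42
    Pn = matmul(Pn, P)

print(Pn)   # both rows approximately (0.5282, 0.4718)
```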
Chapter 5. Poisson Processes
Patience Pays.
The Poisson distribution is often referred to as the law of rare events. Specifically, it is the macro-law (distribution) of micro-rare events. In general, rareness on micro scales can aggregate to become common ...
3.6. Branching Processes
The branching process, as a typical discrete-time Markov chain, is a very useful tool in epidemiological and social studies, particularly in modeling disease spread or population growth.
(i) Example 3.11. (The Confucius descendants ...
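A branching process is easy to experiment with. The sketch below (assuming, for illustration, Poisson offspring with mean $\mu = 1.5$; this is not the distribution of the truncated example above) computes the extinction probability as the smallest fixed point of the offspring PGF $\varphi(s) = e^{\mu(s-1)}$, then checks it by simulating the chain $Z_{n+1} = \sum_{i=1}^{Z_n} \xi_i$:

```python
import math
import random

random.seed(1)
mu = 1.5   # assumed Poisson offspring mean (> 1, so extinction prob. is < 1)

# Extinction probability q = smallest root of q = phi(q) = exp(mu*(q - 1));
# iterate q <- phi(q) starting from 0.
q = 0.0
for _ in range(500):
    q = math.exp(mu * (q - 1.0))

def poisson(m):
    # Knuth's multiplication method; fine for small means
    L, k, p = math.exp(-m), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Monte Carlo check: Z_{n+1} = sum of Z_n iid Poisson(mu) offspring counts.
extinct, trials = 0, 2000
for _ in range(trials):
    z = 1
    for _ in range(60):
        if z == 0 or z > 500:   # died out, or clearly escaped to infinity
            break
        z = sum(poisson(mu) for _ in range(z))
    extinct += (z == 0)

print(q, extinct / trials)   # both near 0.417 for mu = 1.5
```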
Solutions to Selected Problems/Exercises of Chapter 5 of the Textbook.
Chapter 5. Poisson Processes
Problem 2.1. Write (with $np \to \lambda$)
$$P(X(n, p) = 0) = \binom{n}{0} p^0 (1-p)^n = e^{n \log(1-p)} = e^{-np + o(1)} = e^{-\lambda} + o(1).$$
Then,
$$\lim_{n \to \infty} \frac{P(X(n, p) = k+1)}{P(X(n, p) = k)} = \lim_{n \to \infty} \frac{\binom{n}{k+1} p^{k+1} (1-p)^{n-k-1}}{\binom{n}{k} p^{k} (1-p)^{n-k}} = \lim_{n \to \infty} \frac{n-k}{k+1} \cdot \frac{p}{1-p} = \frac{\lambda}{k+1}.$$
Appendix: a forensic analysis of an exercise problem.
The aim of this lecture is actually not review, but rather a somewhat bigger purpose: to arouse your awareness of forming a good habit of problem-solving, be it mathematical or non-mathematical ...
Sample Exam of Math341, Stochastic Modeling
1. Let $N(t)$ be a Poisson process with rate $\lambda > 0$. Let $W_m = \min\{t : N(t) = m\}$. Compute the conditional probability $P(W_1 > 1 \mid W_2 > 2)$.
2. Suppose $W_k$ is the time of the $k$-th birth of a pure birth process $\{X(t)\}$ ...
Review of Chapter 7, Renewal Phenomena
1. Definition of renewal process.
2. Renewal theorem (stopping time, Wald's identity, optional sampling theorem).
3. Current, residual and total life times.
4. Understanding biased sampling.
5. Poisson process as a special renewal process ...
STOCHASTIC MODELING
Math3425
Spring 2012, HKUST
Kani Chen (Instructor)
Chapters 1-2.
Review of Probability Concepts Through Examples
We review some basic concepts about probability spaces through examples, in preparation for the formal contents of this course.
Chapter 3.
Markov Chain: Introduction
Whatever happened in the past, be it glory or misery, be Markov!
3.1. Examples
Example 3.1. (Coin Tossing.)
Let $X_0 = 0$ and, for $i \ge 1$,
$$X_i = \begin{cases} 1 & \text{if the $i$-th toss is a Head (with probability $p$),} \\ 0 & \text{if the $i$-th toss is a Tail.} \end{cases}$$
3.5. First step analysis
A journey of a thousand miles begins with the first step. - Lao Tse
Example 3.6. (Coin Tossing.) Repeatedly toss a fair coin a number of times. What is the expected number of tosses till the first two consecutive heads occur?
Let $X_0 = 0$ ...
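First-step analysis answers this question directly; the sketch below sets up the two first-step equations (states: "no progress" and "just saw a head"), solves them in closed form, and checks the answer by simulation:

```python
import random
from fractions import Fraction

# First-step analysis for E[number of tosses until the first "HH"].
# States: 0 = no progress, 1 = just tossed a head.  With head prob. p:
#   E0 = 1 + p*E1 + (1-p)*E0
#   E1 = 1 + p*0  + (1-p)*E0
# Eliminating E1 gives E0 = (1 + p) / p**2.
p = Fraction(1, 2)
E0 = (1 + p) / p**2
print(E0)                # 6 for a fair coin

# Monte Carlo sanity check of the same quantity.
random.seed(2)
def tosses_until_hh():
    run, n = 0, 0
    while run < 2:
        n += 1
        run = run + 1 if random.random() < 0.5 else 0
    return n

avg = sum(tosses_until_hh() for _ in range(100_000)) / 100_000
print(avg)               # close to 6
```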
Chapter 7. Renewal Phenomena
Renewal is life reborn.
7.1. Definitions and basic concepts.
7.1.1. Definition.
Suppose events happen along time and the time durations between any two consecutive events (inter-occurrence times) are iid positive random variables ...
Chapter 6. Continuous Time Markov Chains (Birth & Death Processes)
Birth is the inception of death, and death that of resurrection; right is the inception of wrong, and wrong that of right. - Zhuang Tse
Continuous time Markov chains refer to Markov processes ...
4.3. More sophisticated examples.
Markov chains require conditional independence of the future (tomorrow and after) and the past (yesterday and before) given a fixed state of the present (today). It often happens that the process in question does not satisfy this requirement ...
6.2. Pure death processes
6.2.1. Postulates of pure death processes.
$\{X(t) : t \in [0, \infty)\}$ is called a pure death process with parameters $\mu_0 = 0, \mu_1, \ldots, \mu_N$ and state space $\{0, 1, \ldots, N\}$, if it is a continuous time Markov chain with nonincreasing paths sa...
Review of Discrete time Markov Chains (Chapters 3 and 4)
1. A diagram. [Flattened in these notes; the recoverable classification: a stochastic process specializes to a (homogeneous) Markov process, which is classified by discrete vs. continuous time and discrete vs. continuous state space: discrete-time Markov chains (Chapters 3 and 4), the pure birth process (Chapter 6; continuous time, discrete state space), and Brownian motion (Chapter 8; continuous time, continuous state space).]
7.3. The asymptotic behavior of renewal processes.
$X_1, X_2, \ldots$ are iid positive inter-occurrence times with mean $\mu$, variance $\sigma^2$ and cumulative distribution function (cdf) $F$. $W_n = \sum_{i=1}^{n} X_i$, $n \ge 1$, are the waiting times ($W_0 = 0$), and $N(t) = \max\{n : W_n \le t\}$ is the renewal counting process.
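The basic asymptotic result, $N(t)/t \to 1/\mu$ almost surely, is easy to see in a simulation; a minimal sketch (the Uniform(0, 2) inter-occurrence distribution, with $\mu = 1$, is an assumed illustrative choice):

```python
import random

random.seed(3)
# Renewal process with inter-occurrence times X_i ~ Uniform(0, 2), so mu = 1.
# The SLLN for renewal processes gives N(t)/t -> 1/mu almost surely.
t = 100_000.0
w, n = 0.0, 0            # w = current waiting time W_n, n = N(t)
while True:
    w += random.uniform(0.0, 2.0)
    if w > t:
        break
    n += 1

rate = n / t
print(rate)              # close to 1/mu = 1
```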
Remarks about conditional expectation.
For a given event $A$ with $P(A) > 0$ and a random variable $X$, the conditional expectation (also called the conditional mean) of $X$ given $A$ can be defined as
$$E(X \mid A) = E(X\,\mathbf{1}_A)/P(A),$$
where
$$\mathbf{1}_A = \begin{cases} 1 & \text{if } A \text{ happens,} \\ 0 & \text{otherwise.} \end{cases}$$
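A tiny discrete sketch of this definition, using a fair die roll with $A$ = "outcome is even" (exact arithmetic with fractions):

```python
from fractions import Fraction

# X = fair die roll, A = "X is even".  Check E(X | A) = E(X * 1_A) / P(A).
outcomes = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)

P_A = sum(p for x in outcomes if x % 2 == 0)           # P(A) = 1/2
E_X_ind = sum(x * p for x in outcomes if x % 2 == 0)   # E(X * 1_A) = 2
cond_mean = E_X_ind / P_A
print(cond_mean)   # 4, the average of {2, 4, 6}
```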