Markov Chains: Introduction
This shows that all finite-dimensional probabilities are specified once the transition
probabilities and initial distribution are given, and in this sense, the process is determined.
Solutions to DIY Exercises of Chapter 5
Exercise 5.1.
Use the moment generating function to derive the mean and variance of a Poisson
random variable.
Solution. Suppose X ~ Poisson(λ). Its moment generating function is M(t) = E e^{tX} = e^{λ(e^t − 1)}.
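The solution breaks off here; a sketch of the standard MGF computation, consistent with the Poisson probability function p(k) = e^{−λ}λ^k/k!:

```latex
\begin{aligned}
M_X(t) &= E\,e^{tX} = \sum_{k=0}^{\infty} e^{tk}\,\frac{\lambda^k e^{-\lambda}}{k!}
        = e^{-\lambda} e^{\lambda e^t} = e^{\lambda(e^t - 1)}, \\
M_X'(t) &= \lambda e^t\, M_X(t), && \text{so } M_X'(0) = \lambda = EX, \\
M_X''(t) &= \bigl(\lambda e^t + (\lambda e^t)^2\bigr) M_X(t), && \text{so } M_X''(0) = \lambda + \lambda^2 = EX^2, \\
\operatorname{Var}(X) &= EX^2 - (EX)^2 = \lambda.
\end{aligned}
```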
Solutions to DIY Exercises of Chapter 6
Exercise 6.1
Does Postulate 1 for the Poisson Process imply independent increments?
Solution. Yes, but it must be coupled with the requirement that it be an MC.
Solutions to DIY Exercises of Chapters 3 and 4.
Exercise 3.1 Bunny rabbit has three dens A, B and C. It likes A better than B and C. If it's
in B or C on any night, it will always take chance 0.9 to
Sample Exam of Math341, Stochastic Modeling
1. Let N(t) be a Poisson process with rate λ > 0. Let W_m = min{t : N(t) = m}. Compute the
conditional probability P(W1 > 1 | W2 > 2).
Solution.
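The displayed steps of this solution are lost in this copy. One way to reconstruct the computation, using the fact that {W1 > 1, W2 > 2} = {N(1) = 0, N(2) ≤ 1} and the independence of increments:

```latex
\begin{aligned}
P(W_1 > 1,\ W_2 > 2) &= P\bigl(N(1) = 0,\ N(2) - N(1) \le 1\bigr)
  = e^{-\lambda}\bigl(e^{-\lambda} + \lambda e^{-\lambda}\bigr) = (1+\lambda)e^{-2\lambda}, \\
P(W_2 > 2) &= P\bigl(N(2) \le 1\bigr) = (1+2\lambda)e^{-2\lambda}, \\
P(W_1 > 1 \mid W_2 > 2) &= \frac{(1+\lambda)e^{-2\lambda}}{(1+2\lambda)e^{-2\lambda}}
  = \frac{1+\lambda}{1+2\lambda}.
\end{aligned}
```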
Solutions to Problems/Exercises of Chapter 4 (Homework # 4).
Chapter 4. The Long Run Behavior of Markov Chains.
Must do problems:
Exercise 1.3 Let π_j be the long-run fraction of time staying in state
Review of Chapters 5 and 6
Chapter 5. Poisson Processes
1. Definitions of Poisson Distribution. (1) Probability function (a scaled Taylor expansion of the exponential function). (2) Mean and variance.
2.
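As a quick numerical check of items (1) and (2), a short sketch (the variable names and the choice λ = 3 are mine, for illustration only):

```python
# The probability function p(k) = e^(-lam) * lam^k / k! is a scaled Taylor
# expansion of e^lam, so the probabilities sum to 1; the mean and the
# variance both equal lam.
import math

def poisson_pmf(k, lam):
    """P(X = k) for X ~ Poisson(lam)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

lam = 3.0
ks = range(60)                      # the tail beyond k = 60 is negligible here
total = sum(poisson_pmf(k, lam) for k in ks)
mean = sum(k * poisson_pmf(k, lam) for k in ks)
variance = sum(k * k * poisson_pmf(k, lam) for k in ks) - mean ** 2
```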
Solutions to DIY Exercises of Chapter 7
Exercise 7.1
Please verify that N(t) + k for any k ≥ 1 is a stopping time, but N(t) is not.
Proof. For j ≥ k,
{N(t) + k = j} = {N(t) = j − k} = {W_{j−k} ≤ t, W_{j−k+1} > t}, which is determined by
W_1, . . ., W_j since j − k + 1 ≤ j.
Chapter 4. The Long Run Behavior of Markov Chains
In the long run, we are all equal. -with apologies to John Maynard Keynes
4.1. Regular Markov chains.
Example 4.1 Let {Xn} be an MC with two states
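The example's transition matrix is cut off here. As an illustration with a hypothetical regular two-state matrix (my choice, not necessarily the one in the notes), the limiting distribution can be found by power iteration:

```python
# Power iteration for the limiting distribution of a regular two-state MC.
# The stationary pi solves pi = pi P; for this hypothetical P it is
# pi = (0.625, 0.375), regardless of the initial distribution.
P = [[0.7, 0.3],
     [0.5, 0.5]]
pi = [1.0, 0.0]                 # any initial distribution converges
for _ in range(100):
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]
```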
Chapter 5. Poisson Processes
Patience Pays.
The Poisson distribution is often referred to as the law of rare events. Specifically, it is the macro-law (distribution) of micro-rare events. In general,
3.6. Branching Processes
The branching process, as a typical discrete-time Markov chain, is a very useful tool in epidemiological
and social studies, particularly in modeling disease spread or population growth.
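A small numerical sketch of the standard extinction-probability computation, with a hypothetical offspring distribution (p0 = 1/4, p1 = 1/4, p2 = 1/2, chosen purely for illustration); the extinction probability is the smallest root in [0, 1] of s = f(s), where f is the offspring pgf, here q = 1/2:

```python
# Extinction probability of a branching process as the smallest fixed
# point of the offspring pgf f(s) = p0 + p1*s + p2*s^2.  With the
# hypothetical law p0 = 1/4, p1 = 1/4, p2 = 1/2 (mean offspring 1.25 > 1),
# iterating q <- f(q) from q = 0 converges to q = 1/2.

def f(s):
    return 0.25 + 0.25 * s + 0.5 * s * s

q = 0.0
for _ in range(200):
    q = f(q)
```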
Solutions to Selected Problems/Exercises of Chapter 5 of the Textbook.
Chapter 5. Poisson Processes
Problem 2.1 Write, for X(n, p) ~ Binomial(n, p) with np → λ,
P(X(n, p) = 0) = C(n, 0) p^0 (1 − p)^n = e^{n log(1−p)} = e^{−np + o(1)} = e^{−λ} + o(1).
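The limit above can be checked numerically; a small sketch (λ = 2 is an arbitrary choice for illustration):

```python
# Law of rare events at k = 0: with p = lam / n,
# P(X(n, p) = 0) = (1 - lam/n)**n  ->  exp(-lam)  as n grows.
import math

lam = 2.0
approximations = {n: (1.0 - lam / n) ** n for n in (10, 100, 10_000)}
limit = math.exp(-lam)   # the gap shrinks like O(1/n)
```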
Appendix: a forensic analysis of an exercise problem.
The aim of this lecture is actually not for review, but rather for a somewhat bigger purpose: to
arouse your awareness of forming a nice habit of
Sample Exam of Math341, Stochastic Modeling
2. Suppose Wk is the
Review of Chapter 7, Renewal Phenomena
1. Definition of Renewal Process.
2. Renewal Theorem (stopping time, Wald's identity, optional sampling theorem).
3. Current, residual and total life times.
4. U
STOCHASTIC MODELING
Math3425
Spring 2012, HKUST
Kani Chen (Instructor)
Chapters 1-2.
Review of Probability Concepts Through Examples
We review some basic concepts about probability space through examples.
Chapter 3.
Markov Chain: Introduction
Whatever happened in the past, be it glory or misery, be Markov!
3.1. Examples
Example 3.1. (Coin Tossing.)
Let ξ_0 = 0 and, for i ≥ 1,
ξ_i = 1 if the i-th toss is a head, and 0 otherwise.
3.5. First step analysis
A journey of a thousand miles begins with the first step. -Lao Tzu
Example 3.6. (Coin Tossing) Repeatedly toss a fair coin a number of times. What's the
expected number of tosses
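The question is truncated here; assuming it asks for the expected number of tosses until the first head (an assumption on my part), first-step analysis conditions on the first toss: E = 1 + (1/2)·0 + (1/2)·E, so E = 2. A quick numerical cross-check:

```python
# First-step analysis (assuming the target is the first head of a fair
# coin): E = 1 + (1/2)*0 + (1/2)*E, so 0.5*E = 1 and E = 2.
E = 1.0 / (1.0 - 0.5)                              # solving E = 1 + 0.5*E
series = sum(k * 0.5 ** k for k in range(1, 200))  # sum k*(1/2)^k = 2, cross-check
```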
Chapter 7. Renewal Phenomena
Renewal is life reborn.
7.1. Definitions and basic concepts.
7.1.1. Definition.
Suppose events occur over time and the time durations between any two consecutive events
(
Chapter 6. Continuous Time Markov Chains (Birth & Death Processes)
Birth is the inception of death, and death that of resurrection; right is the inception of wrong, and
wrong that of right. -Zhuangzi
4.3. More sophisticated examples.
Markov Chains require conditional independence of the future (tomorrow and after) and the past (yesterday
and before) given a fixed state of the present (today). It often happens
6.2. Pure death processes
6.2.1. Postulates of pure death processes.
{X(t) : t ∈ [0, ∞)} is called a pure death process with parameters μ_0 = 0, μ_1, . . ., μ_N, and state
space {0, 1, . . ., N}, if it is a
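The postulates are cut off here. As an illustration only (not the textbook's construction), a minimal simulation sketch of a pure death process, assuming hypothetical rates μ_k = k: from state k ≥ 1 the process waits an Exp(μ_k) holding time and then jumps to k − 1, and state 0 is absorbing.

```python
# Illustration: simulate a pure death process with hypothetical rates
# mu_k = k.  From state k >= 1, wait an Exp(mu_k) holding time, then jump
# down to k - 1; state 0 is absorbing.
import random

random.seed(0)

def simulate_pure_death(N):
    """Return [(jump_time, state), ...] from X(0) = N down to absorption at 0."""
    t, state = 0.0, N
    jumps = [(t, state)]
    while state > 0:
        t += random.expovariate(state)  # holding time in state k, rate mu_k = k
        state -= 1
        jumps.append((t, state))
    return jumps

jumps = simulate_pure_death(5)
```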
Review of Discrete-Time Markov Chains (Chapters 3 and 4)
1. A diagram (flattened here): Stochastic process ⊃ (homogeneous) Markov process, classified by time (discrete vs. continuous) and by state space (discrete vs. continuous); Brownian motion (Chapter 8) is the continuous-time, continuous-state case.
7.3. The asymptotic behavior of renewal process.
X_1, X_2, . . . are iid positive inter-occurrence times with mean μ, variance σ² and cumulative
distribution function (cdf) F. W_n = Σ_{i=1}^n X_i, n ≥ 1, are the waiting times.
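The strong law behind the asymptotics (W_n/n → μ, hence N(t)/t → 1/μ) can be illustrated numerically; the Uniform(1, 3) inter-occurrence times below are a hypothetical choice, with μ = 2:

```python
# Numerical illustration of W_n / n -> mu (and hence N(t)/t -> 1/mu),
# with hypothetical Uniform(1, 3) inter-occurrence times, so mu = 2.
import random

random.seed(0)
n = 100_000
inter_times = [random.uniform(1.0, 3.0) for _ in range(n)]
W_n = sum(inter_times)       # the n-th waiting (renewal) time
mean_estimate = W_n / n      # strong law: close to mu = 2
```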
Remarks about conditional expectation.
For a given event A with P(A) > 0 and a random variable X, the conditional expectation,
also called the conditional mean, of X given A can be defined as
E(X|A) = E(X · 1_A)/P(A), where 1_A is the indicator of A.
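A worked example of the usual definition E(X|A) = E(X · 1_A)/P(A), using a fair die and A = {X is even} (my choice, for illustration):

```python
# E(X | A) = E(X * 1_A) / P(A) for a fair die X and A = {X is even}:
# E(X * 1_A) = (2 + 4 + 6)/6 = 2, P(A) = 1/2, so E(X | A) = 4.
outcomes = [1, 2, 3, 4, 5, 6]
p = 1.0 / 6.0                                         # each face equally likely
P_A = sum(p for x in outcomes if x % 2 == 0)          # P(A) = 1/2
E_X_1A = sum(x * p for x in outcomes if x % 2 == 0)   # E(X * 1_A) = 2
cond_mean = E_X_1A / P_A                              # = 4
```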