HW2, Math 468, Due 2/2
1. Markov Chains - Basic Concepts
Suppose that states have been labelled as s_1, s_2, s_3, ..., so that we can think of a
probability distribution on the state space as a row vector
\pi = (\pi_1, \pi_2, \dots)
where 0 \le \pi_i = \pi(s_i) \le 1 is the probability of state s_i.
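The row-vector picture lends itself to a quick numerical check. Below is a minimal Python sketch; the 3-state transition matrix P is a made-up example (not from the homework), and evolve simply applies \pi \mapsto \pi P repeatedly.

```python
import numpy as np

# Made-up 3-state transition matrix (each row sums to 1); illustrative only.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

# Initial distribution as a row vector: start in state s1 with probability 1.
pi0 = np.array([1.0, 0.0, 0.0])

def evolve(pi, P, n):
    """Distribution after n steps: the row vector pi P^n."""
    for _ in range(n):
        pi = pi @ P
    return pi
```

Each evolve(pi0, P, n) is again a probability vector: entries in [0, 1] summing to 1.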
HW9, MATH 468, DUE 4/26 OR 5/1
0.1. Basic properties of standard Brownian motion. (i) W(0) = 0.
(ii) W(t_1) - W(s_1), W(t_2) - W(s_2), ..., W(t_n) - W(s_n) are independent rvs for
0 \le s_1 \le t_1 \le s_2 \le t_2 \le \dots \le s_n \le t_n.
(iii) W(t) - W(s) is a normal random variable with mean 0 and variance t - s.
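Properties (i)-(iii) can be checked empirically by simulating a path on a fine grid; the step counts and seeds below are arbitrary choices, not from the text.

```python
import numpy as np

def brownian_path(T=1.0, n=1000, rng=None):
    """Sample W on an n-step grid over [0, T] via iid normal increments.
    Encodes (i)-(iii): W(0) = 0, independent increments, W(t)-W(s) ~ N(0, t-s)."""
    rng = np.random.default_rng(0) if rng is None else rng
    dt = T / n
    increments = rng.normal(0.0, np.sqrt(dt), size=n)
    return np.concatenate([[0.0], np.cumsum(increments)])
```

Sampling many paths and looking at the empirical mean and variance of W(1) recovers the N(0, 1) distribution that (iii) predicts.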
HW7, MATH 468, DUE 3/29
0.1. More Poisson Process Questions. Consider a Poisson process with parameter \lambda on 0 \le t < \infty, where X(t) denotes the number of events that occur in
the time interval (0, t].
1. (Number 6 page 108) Find P(T_1 \le s | X(t) = n) for 0 \le s \le t and
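A standard route into this kind of conditional question: given X(t) = n, the n arrival times are distributed as the order statistics of n iid Uniform(0, t] points, so the first arrival time T_1 is their minimum. The sketch below Monte-Carlos that fact; the parameter values are arbitrary.

```python
import numpy as np

def p_first_arrival_by(s, t, n, trials=200_000, rng=None):
    """Monte Carlo estimate of P(T1 <= s | X(t) = n).
    Conditioned on X(t) = n, the arrival times are distributed as the order
    statistics of n iid Uniform(0, t] points, so T1 is their minimum."""
    rng = np.random.default_rng(0) if rng is None else rng
    mins = rng.uniform(0.0, t, size=(trials, n)).min(axis=1)
    return float((mins <= s).mean())
```

For comparison, the minimum of n uniforms gives the closed form 1 - ((t - s)/t)^n.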
HW6 SOLUTIONS, MATH 468, DUE 3/22
0.1. Some Basics of Chapter 3. In this summary, unless stated otherwise,
assume that the states in S are listed as s_0, s_1, ... (where this list may or may not be
finite). Arbitrary states are denoted by x, y, ..., x_0, x_1, ..., etc.
HW5, MATH 468, DUE 3/8
0.1. Basic Results of Chapter 2.
Theorem 0.1. (a) An equilibrium vanishes on all transient and null recurrent
states.
(b) A positive recurrent communication class supports a unique equilibrium; the
unique equilibrium is given by
\pi(y) = 1/E_y(T_y) for each state y in the class, where T_y is the first return time to y.
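Part (b) can be sanity-checked numerically: solve \pi p = \pi for a small positive recurrent chain and compare \pi(y) against the reciprocal of a simulated mean return time. The 3-state matrix below is an illustrative assumption, not from the text.

```python
import numpy as np

# Made-up 3-state chain in which all states communicate (positive recurrent).
P = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.4, 0.2],
              [0.3, 0.3, 0.4]])

def stationary(P):
    """Solve pi P = pi, sum(pi) = 1, via the eigenvector of P^T for eigenvalue 1."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return v / v.sum()

def mean_return_time(P, y, steps=100_000, rng=None):
    """Estimate E_y(T_y) as steps / (number of visits to y) along one long run."""
    rng = np.random.default_rng(0) if rng is None else rng
    x, visits = y, 0
    for _ in range(steps):
        x = rng.choice(len(P), p=P[x])
        if x == y:
            visits += 1
    return steps / visits
```

The long-run fraction of time at y is 1/E_y(T_y), which is exactly what the simulated estimate converges to.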
HW4, MATH 468, DUE 2/16
0.1. Basic Notions of Chapter 2. Suppose that we are given a Markov chain.
List the states as s_1, s_2, ..., and let p denote the probability transition matrix.
Definition. (a) A stationary distribution (or an equilibrium distribution)
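The usual defining property of a stationary distribution, \pi p = \pi, is easy to test directly. The doubly stochastic example below (for which the uniform vector is stationary) is made up for illustration.

```python
import numpy as np

# Made-up doubly stochastic matrix (rows and columns sum to 1);
# for such a chain the uniform distribution is stationary.
p = np.array([[0.0, 0.5, 0.5],
              [0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0]])

def is_stationary(pi, p, tol=1e-12):
    """Check the usual defining property: pi is a distribution and pi p = pi."""
    return (np.all(pi >= 0)
            and abs(pi.sum() - 1.0) < tol
            and np.allclose(pi @ p, pi, atol=tol))

uniform = np.full(3, 1.0 / 3.0)
```

A point mass at one state fails the check here, since the chain immediately leaves that state.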
HW3, MATH 468, DUE 2/9
0.1. Summary of Basic Results of Chapter 1. Here is a useful general
formula (page 19):
(0.1)   E_x(N(y)) = E_x\Big(\sum_{n \ge 1} 1_{\{X_n = y\}}\Big) = \sum_{n \ge 1} E_x\big(1_{\{X_n = y\}}\big) = \sum_{n \ge 1} P_x(X_n = y) = \sum_{n \ge 1} p^n(x, y)
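Formula (0.1) has an exact finite-horizon analogue that is convenient to test numerically: the expected number of visits to y in the first N steps equals \sum_{n=1}^{N} p^n(x, y). The sketch compares that sum with a Monte Carlo estimate on a made-up 2-state chain.

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])  # made-up 2-state chain, illustrative only

def expected_visits(P, x, y, N):
    """sum_{n=1}^{N} p^n(x, y): expected number of visits to y in the first N steps."""
    total, Pn = 0.0, np.eye(len(P))
    for _ in range(N):
        Pn = Pn @ P
        total += Pn[x, y]
    return total

def simulated_visits(P, x, y, N, trials=20_000, rng=None):
    """Monte Carlo estimate of the same expectation, starting from x."""
    rng = np.random.default_rng(0) if rng is None else rng
    count = 0
    for _ in range(trials):
        s = x
        for _ in range(N):
            s = rng.choice(len(P), p=P[s])
            if s == y:
                count += 1
    return count / trials
```

Matching the matrix-power sum against the simulation is exactly the interchange of expectation and sum that (0.1) performs.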
Theorem 0.1. (Theorem 1, page 19) (i) Suppose that y is a t
HW1, Math 468, Due 1/26
1. Notions from Discrete Probability
Key Concepts: probability space (usually denoted (\Omega, F, P), and we may frequently suppress F), conditional probability, independence, partition theorem, discrete random variable, probability mass function
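The partition theorem (law of total probability) can be illustrated with a tiny urn example; the urns and probabilities below are invented for illustration. If B_1, ..., B_k partition the sample space, then P(A) = \sum_i P(A | B_i) P(B_i).

```python
from fractions import Fraction

# Invented example: pick urn1 or urn2 by a fair coin (the partition),
# then ask for the probability of drawing a red ball (the event A).
P_B = {"urn1": Fraction(1, 2), "urn2": Fraction(1, 2)}
P_A_given_B = {"urn1": Fraction(2, 5), "urn2": Fraction(3, 10)}

def total_probability(P_B, P_A_given_B):
    """Partition theorem: P(A) = sum_i P(A | B_i) P(B_i)."""
    return sum(P_A_given_B[b] * P_B[b] for b in P_B)
```

Here P(A) = (1/2)(2/5) + (1/2)(3/10) = 7/20; exact rational arithmetic keeps the bookkeeping honest.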
First Exam Review, Math 468
1. Very Broad Overview
We think of a Markov chain as a noisy discrete time dynamical system with a
countable state space. As in dynamical systems, we are especially interested in
what happens to the system after a long time.
EXAM 2 REVIEW, MATH 468, SPRING 2012
1. Bare Bones of Exam 1 Material
Suppose that {X_n : n = 0, 1, 2, 3, ...} is a discrete time Markov chain with state
space S.
1.1. Basic Notions. The chain is determined by the initial distribution on S
and the probability transition matrix.
MATH 355-002
Spring 2012
HAND-IN HOMEWORK #8
(due Thursday, April 5)
EXERCISE 1. On page 288 (in Chapter 5, Section 4.6) appears a system (6.5) of ODEs that
model the growth of a microorganism in a chemostat. Figures 6.10 and 6.11 on page 289 show
phase portraits.
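Since system (6.5) itself is not reproduced here, the sketch below integrates a *standard* chemostat model with Monod uptake as a stand-in; every equation and parameter value is an assumption for illustration, not taken from the textbook.

```python
import numpy as np

# Illustrative parameters (all made up): dilution rate, feed substrate
# concentration, yield, maximal growth rate, half-saturation constant.
D, S_in, Y, mu_max, K = 0.3, 2.0, 0.5, 1.0, 0.5

def chemostat(t, z):
    """Assumed standard chemostat: S' = D(S_in - S) - mu(S) N / Y,
    N' = (mu(S) - D) N, with Monod growth mu(S) = mu_max S / (K + S)."""
    S, N = z
    mu = mu_max * S / (K + S)
    return np.array([D * (S_in - S) - mu * N / Y, (mu - D) * N])

def rk4(f, z0, t0, t1, steps):
    """Classical fixed-step fourth-order Runge-Kutta integrator."""
    z, t = np.asarray(z0, dtype=float), t0
    h = (t1 - t0) / steps
    for _ in range(steps):
        k1 = f(t, z)
        k2 = f(t + h / 2, z + h / 2 * k1)
        k3 = f(t + h / 2, z + h / 2 * k2)
        k4 = f(t + h, z + h * k3)
        z = z + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return z

# One trajectory: substrate S and biomass N from an assumed initial condition.
z_final = rk4(chemostat, [2.0, 0.1], 0.0, 50.0, 5000)
```

Sweeping the initial condition and plotting N against S along each trajectory produces the kind of phase portrait such figures typically show; here the trajectory settles at the interior equilibrium where mu(S) = D.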