Worksheet # 1
Statistics 150, Pitman, Spring 2013
Review of basic probability framework. Limits of random variables. Reading: All of Chapter
1. Focus: 1.5.5 MGFs and PGFs (z-transforms), 1.6.3 Chernoff bounds, 1.7.5/6 Convergence
of RVs. Monotone Convergence
25 February 2013
This Week.
First passage and return distributions for random walks
Mostly simple (p , q ) random walks
Symmetric simple random walks: p = q = 1/2
Exercising generating function skills
Wald's identity
Confidence dealing with random time
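Wald's identity lends itself to a quick numerical sanity check. The sketch below (my own illustration, not from the notes; the parameter p = 0.6 and the step cap are arbitrary choices) estimates the mean first passage time to +1 for a simple (p, q) walk. Since S_T = 1 at the stopping time T, Wald's identity E[S_T] = E[T] E[X_1] predicts E[T] = 1/(p − q) = 5.

```python
import random

def first_passage_time(p, target=1, cap=10_000):
    """Steps until a simple (p, q) random walk first hits `target`."""
    s, n = 0, 0
    while s != target and n < cap:
        s += 1 if random.random() < p else -1
        n += 1
    return n

random.seed(0)
p = 0.6
trials = 20_000
mean_T = sum(first_passage_time(p) for _ in range(trials)) / trials
# Wald: E[S_T] = E[T] * E[X1], with E[S_T] = 1 and E[X1] = p - q = 0.2
print(mean_T)   # should be near 1 / (p - q) = 5
```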
10 April, 2013
Queuing Models. General idea: customers arrive at a service station expecting a service.
They queue up for service, receive the service, then depart. The classic classification of
queues follows the · / · / · format. Typically in the first dot goes M
12 April 2013
Laplace Transforms. Consider a random variable X ≥ 0. Its Laplace transform is φ(λ) :=
φ_X(λ) = E[e^{−λX}].
Motivation. The Laplace Transform is a kind of moment generating function. Recall that
a moment generating function is E[e^{θX}], for some real θ (o
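As a small check of the definition above (the rate μ and argument λ here are my own illustrative choices), one can estimate the Laplace transform of an Exponential(μ) variable by Monte Carlo and compare it to the standard closed form μ/(μ + λ):

```python
import math
import random

random.seed(1)
mu, lam = 2.0, 1.5                      # Exp(mu) rate and transform argument
n = 200_000
xs = [random.expovariate(mu) for _ in range(n)]
mc = sum(math.exp(-lam * x) for x in xs) / n   # Monte Carlo E[exp(-lam X)]
exact = mu / (mu + lam)                        # closed form for Exp(mu)
print(mc, exact)
```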
15 April 2013
Feynman-Kac Formula.
The problem. Let (X_t, t ≥ 0) be a continuous time Markov chain with state space S ∪ B,
where
S is a set of internal states.
B is a nonempty set of boundary (absorbing) states.
Assume it is possible to reach B from any in
17 April 2013
Today.
Complete proof of F-K formula
Discuss issues around transforms
Prepare for discussion of continuous time / space processes
If you want to get ahead, read the Gaussian section, prepare for Brownian motion
F-K Formula. From last time, w
19 April 2013
Last Couple Weeks.
Gaussian Processes and Brownian Motion (Ch. 3)
Martingales (Ch. 9)
Brownian Motion.
Motivation.
Understand scaling limits of random variables
Direct modeling of stock market, etc.
(B_t, t ≥ 0) will be a continuous time a
8 April 2013
Wrap Up Ehrenfest Urn. Recall: in this model, we have N labeled balls, two urns labeled
I and II. Let X_t be the number of balls in urn I at time t. Each particle flips boxes at rate
1, i.e. each particle has its own Poisson process of rate 1.
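Since each of the N balls carries its own rate-1 clock, the next flip overall occurs at rate N, and the flipping ball sits in urn I with probability X_t/N. The sketch below (my own simulation; N = 10 and the jump count are arbitrary) checks that the long-run time average of X_t matches the Binomial(N, 1/2) equilibrium mean N/2:

```python
import random

random.seed(2)
N = 10
x = N                      # start with all balls in urn I
jumps = 200_000
area = total_t = 0.0       # time-weighted sum of x, and total time
for _ in range(jumps):
    dt = random.expovariate(N)   # next flip: min of N independent rate-1 clocks
    area += x * dt
    total_t += dt
    # the flipping ball is in urn I with probability x / N
    x += -1 if random.random() < x / N else 1
time_avg = area / total_t
print(time_avg)   # equilibrium Binomial(N, 1/2) has mean N/2 = 5
```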
5 April 2013
Last time, we developed insight into the structure of a two-state chain in continuous time:
states 0 and 1, with rate λ for the jump 0 → 1 and rate μ for the jump 1 → 0.
The insight:
1) The steady state equilibrium is given by π(0) = μ/(λ + μ), π(1) = λ/(λ + μ).
2) If you run a PP(λ + μ), you can thin to get the transitions, i.e. the types of arr
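The thinning description can be made concrete in a few lines. A sketch under the standard reading (my own illustration; rates λ = 1, μ = 3 are arbitrary): each PP(λ + μ) arrival independently sends the chain to state 1 with probability λ/(λ + μ), else to state 0, and the long-run fraction of time in state 1 should come out to λ/(λ + μ):

```python
import random

random.seed(3)
lam, mu = 1.0, 3.0        # 0 -> 1 at rate lam, 1 -> 0 at rate mu
total = lam + mu
state = 0
t = time_in_1 = 0.0
for _ in range(200_000):
    dt = random.expovariate(total)   # inter-arrival times of the PP(lam + mu)
    if state == 1:
        time_in_1 += dt
    t += dt
    # thinning: each arrival independently sends the chain to 1 w.p. lam/total
    state = 1 if random.random() < lam / total else 0
frac = time_in_1 / t
print(frac)   # equilibrium pi(1) = lam / (lam + mu) = 0.25
```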
3 April 2013
Brief Review of Last Class. Any continuous time Markov chain (X(t), t ≥ 0) with finite
state space S can be constructed as X(t) = Y_{N(t)}, where N is a Poisson Process with
parameter λ, and Y is a jumping chain with transition matrix in discrete
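The construction X(t) = Y_{N(t)} also gives a numerical recipe: conditioning on the number of Poisson jumps yields P(t) = Σ_k e^{−λt} (λt)^k / k! · P^k, where P is the jump-chain matrix. A sketch for a two-state chain (my own example; rates a = 1, b = 2 and t = 0.7 are arbitrary), checked against the well-known two-state closed form:

```python
import math

# two-state chain: 0 -> 1 at rate a, 1 -> 0 at rate b
a, b = 1.0, 2.0
lam = a + b                          # uniformization rate (>= every exit rate)
# jump matrix P = I + G/lam, as nested lists
P = [[1 - a / lam, a / lam],
     [b / lam, 1 - b / lam]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transition(t, terms=60):
    """P(t) via Poissonization: sum_k e^{-lam t} (lam t)^k / k! * P^k."""
    H = [[0.0, 0.0], [0.0, 0.0]]
    Pk = [[1.0, 0.0], [0.0, 1.0]]    # P^0 = I
    for k in range(terms):
        w = math.exp(-lam * t) * (lam * t) ** k / math.factorial(k)
        for i in range(2):
            for j in range(2):
                H[i][j] += w * Pk[i][j]
        Pk = matmul(Pk, P)
    return H

t = 0.7
H = transition(t)
# closed form for the two-state chain
exact00 = b / (a + b) + a / (a + b) * math.exp(-(a + b) * t)
print(H[0][0], exact00)
```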
27 February 2013
Today.
Consolidate results about simple random walk first passage time
Wald's identity (random sums specifically)
Simple Random Walks.
We have found the generating function, and hence the distribution, of T_0, which is defined
as the first hitti
1 March 2013
Today.
Sums of a random number of random variables
Branching Processes
Branching Processes. Branching process inspires a picture of a tree:
Each line represents a generation, with the first generation at the left.
How do we model these proces
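A standard computation with branching processes, worth sketching in code: the extinction probability is the smallest fixed point of the offspring pgf f on [0, 1], reached by iterating q ← f(q) from q = 0. The offspring law here (p0 = 1/4, p1 = 1/4, p2 = 1/2) is my own illustrative choice; solving f(s) = s by hand gives extinction probability 1/2.

```python
def f(s):
    """Offspring pgf for the illustrative law p0=1/4, p1=1/4, p2=1/2."""
    return 0.25 + 0.25 * s + 0.5 * s * s

# extinction probability = smallest fixed point of f in [0, 1];
# iterate q <- f(q) from 0 (monotone convergence to that fixed point)
q = 0.0
for _ in range(200):
    q = f(q)
print(q)   # analytic answer: f(s) = s  gives  s = 1/2
```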
4 March 2013
Today / Announcements.
Continue Branching Processes
Consolidate techniques for working with generating functions and absorption probabilities
Need a tutor? Click here
As the GSI said in section, Pitman's notes on the Branching Process, spe
6 March 2013
Today.
Practice Problem
Branching Processes and Random Walks
Practice Problem. Sketch the binomial(n, 1/2) generating function for n = 1, 2, 3, 4. Answer:
[plot: the four curves G_n(s) = ((1 + s)/2)^n on 0 ≤ s ≤ 1, with intercepts G_n(0) = 1/2, 1/4, 1/8, 1/16 at s = 0]
To see this, notice that the general form of the generating functio
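The closed form can be verified directly: the Binomial(n, 1/2) pgf is E[s^X] = ((1 + s)/2)^n, and its value at s = 0 is P(X = 0) = 2^{−n}, which matches the intercepts in the sketch. A minimal check against the defining sum:

```python
from math import comb

def binom_pgf(s, n):
    """PGF of Binomial(n, 1/2): E[s^X] = ((1 + s)/2)**n."""
    return ((1 + s) / 2) ** n

def pgf_direct(s, n):
    """Same pgf from the defining sum over the pmf."""
    return sum(comb(n, k) * 0.5 ** n * s ** k for k in range(n + 1))

for n in range(1, 5):
    print(n, binom_pgf(0.0, n))   # intercepts: 1/2, 1/4, 1/8, 1/16
```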
8 March 2013
Today / Announcements.
Exercise
Wrap up discussion of branching processes and random walks
Midterms: 2010 MT w/ Solns, 2009 MT w/o Solns, 2006 MT w/o Solns
Exercise. Plot the probability generating function for a geometric random variable
20 March 2013
Today. Continue continuous time Markov Chains
Poisson Process on the Line
Connections to Renewal Theory
Poisson Process on the Line. Consider a process on the line with rate 1 per minute. Let
T_n := X_1 + · · · + X_n. Maybe it's a really busy bus s
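The partial-sum description T_n = X_1 + · · · + X_n with i.i.d. Exp(1) gaps can be checked directly: N(t) = #{n : T_n ≤ t} should be Poisson(t), so its mean and variance should both be near t. A quick simulation (my own sketch; t = 10 is arbitrary):

```python
import random

random.seed(4)
rate, t = 1.0, 10.0

def count_arrivals():
    """N(t): number of partial sums T_n = X_1 + ... + X_n landing in [0, t]."""
    total, n = 0.0, 0
    while True:
        total += random.expovariate(rate)   # X_i ~ Exp(1)
        if total > t:
            return n
        n += 1

trials = 50_000
counts = [count_arrivals() for _ in range(trials)]
mean = sum(counts) / trials
var = sum((c - mean) ** 2 for c in counts) / trials
print(mean, var)   # both near t = 10 (Poisson: mean = variance)
```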
22 March 2013
Reading.
Poisson Process
Renewal Theory (Limit Theorems)
Pitman's notes:
Poisson Process
Jump Hold Description
Transition Semigroups
Limit Distribution of Continuous Time Markov Chains
Poissonization of Discrete Time Markov Chains
Ge
1 April 2013
Brief Review of Poisson Methods / Tools / Tricks (Must Know).
Let us interpret this picture. Remember:
[picture: arrival times 0 = T_0 < T_1 < T_2 < · · · of a PP(λ) on the time axis, each T_i marked with an independent uniform U_i in [0, 1]; the level p splits the marks, so the points with U_i ≤ p form a PP(λp) and the rest form a PP(λq), i.e. λ = λp + λq]
This is the important picture for this lecture
T_0 = 0 < T_1 < T_2 < · · ·
(T_i − T_{i−1}, i ≥ 1) a
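The thinning in this picture is easy to simulate: generate the PP(λ) points, mark each with an independent uniform, and keep those whose mark is at most p. The kept points should form a process of rate λp. A sketch (my own choices of λ = 2, p = 0.3, and the window length):

```python
import random

random.seed(5)
lam, p, t = 2.0, 0.3, 1_000.0
# points of a PP(lam) on [0, t], built from Exp(lam) gaps
s, points = 0.0, []
while True:
    s += random.expovariate(lam)
    if s > t:
        break
    points.append(s)
# mark each point with an independent uniform U_i; keep those with U_i <= p
kept = [x for x in points if random.random() <= p]
rate_all = len(points) / t
rate_kept = len(kept) / t
print(rate_all, rate_kept)   # near lam = 2 and lam * p = 0.6
```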
Worksheet # 5
Statistics 150, Pitman, Spring 2013
Topics: First Return Probabilities for Simple Random Walk, Occupation Times for Symmetric Simple Random Walk, Recurrence/Transience for Simple Random Walk, Stopping
Times and Wald's Identity
Reading: Top
Stat 150 Homework # 8 Solutions
1: If the density function of a distribution has the form c_{k,λ} x^{k−1} e^{−λx}, then c_{k,λ} must be the
normalizing constant 1 / ∫_0^∞ x^{k−1} e^{−λx} dx, and immediately from the functional form of the density
(i.e. being some constant times x^{k−1} e^{−λx}) w
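The normalizing constant here has the familiar Gamma-function value c_{k,λ} = λ^k / Γ(k), which can be checked against a crude numerical integral (my own sketch; k = 3.5, λ = 2 are arbitrary, and the Riemann sum is deliberately simple):

```python
import math

k, lam = 3.5, 2.0
# numerically integrate x^{k-1} e^{-lam x} over [0, 40] with a left Riemann sum
dx, big = 1e-4, 40.0
integral = sum((i * dx) ** (k - 1) * math.exp(-lam * i * dx) * dx
               for i in range(1, int(big / dx)))
exact = math.gamma(k) / lam ** k          # closed form of the integral
print(1 / integral, lam ** k / math.gamma(k))   # two views of c_{k,lam}
```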
Stat 150 Midterm 1 Spring 2015
Instructor: Allan Sly
Name:
SID:
There are 4 questions worth a total of 48 points. Attempt all questions and show your
working - solutions without explanation will not receive full credit. Answer the questions in
the space p
Stat 150 Spring 2015 Syllabus
Available online at http://www.stat.berkeley.edu/~sly/Stat150Spring2015Syllabus.pdf
Instructor: Allan Sly
GSI: Jonathan Hermon
Course Webpage: http://www.stat.berkeley.edu/~sly/STAT150.html
Class Time: MWF 12:00 - 1:00 PM in ro
Stat 150 Homework # 6 Solutions
1: Denote t_0 = 0 = x_0, s_i := t_i − t_{i−1} and y_i := x_i − x_{i−1}. Then ∩_{i=1}^{3} {N(t_i) = x_i} =
∩_{i=1}^{3} {N([t_{i−1}, t_i)) = y_i} (up to an event of 0 probability in which there was an arrival at
one of the times {t_1, t_2, t_3}). Thus b
Stat 150 Homework # 9 Solutions
1: By the stationarity of BM this is like 2 + W_1 given that W_2 = 2.
First way - using scaling: B_t = (1/√2) W_{2t} is a standard BM if W_t is. So what we have is like
2 + √2 B_{1/2} given that √2 B_1 = 2. You saw in lecture that given tha
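The scaling step used here says W_{ct}/√c is again a standard BM; in particular W_2/√2 should be a standard normal at time 1. A quick distributional check (my own sketch, sampling W_2 ~ N(0, 2) directly):

```python
import math
import random

random.seed(6)
n = 100_000
# W_2 ~ N(0, 2) for standard BM W; scaling says B_1 := W_2 / sqrt(2) ~ N(0, 1)
samples = [random.gauss(0.0, math.sqrt(2.0)) / math.sqrt(2.0) for _ in range(n)]
var = sum(x * x for x in samples) / n
print(var)   # should be near 1
```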
Stat 150 Homework # 4 Solutions
1: Let X ~ Geometric(p). Then
G_X(s) = Σ_{k=0}^∞ p[(1 − p)s]^k = p / (1 − (1 − p)s),
Let Y_1, Y_2, . . . be i.i.d. Bernoulli(q) random variables, such that X, Y_1, Y_2, . . . are independent. Define Y := Σ_{i=1}^{X} Y_i. Then Y ~ Bin(X, q). We have that
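For a random sum like this, the pgf composes: G_Y(s) = G_X(G_{Y_1}(s)) = G_X(1 − q + qs). A Monte Carlo check of that composition (my own sketch; p, q, and the evaluation point s are arbitrary choices):

```python
import random

random.seed(7)
p, q, s = 0.4, 0.5, 0.6   # illustrative parameters

def geom(p):
    """Geometric(p) on {0, 1, 2, ...}: failures before the first success."""
    k = 0
    while random.random() > p:
        k += 1
    return k

n = 200_000
mc = 0.0
for _ in range(n):
    x = geom(p)
    # Y = Y_1 + ... + Y_X with Y_i ~ Bernoulli(q)
    y = sum(1 for _ in range(x) if random.random() < q)
    mc += s ** y          # Monte Carlo estimate of E[s^Y]
mc /= n

G_X = lambda u: p / (1 - (1 - p) * u)   # geometric pgf from the solution above
exact = G_X(1 - q + q * s)              # composition G_Y(s) = G_X(1 - q + q s)
print(mc, exact)
```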
Stat 150 Homework # 7 Solutions
A summary of some useful facts:
A continuous-time Markov chain (Yt )tR+ on a finite state space can be described in several
ways:
(i) By its generator G. In which case H_t(x, y) := Pr[Y_t = y | Y_0 = x] = (e^{tG})_{x,y}. In which
Stat 150 Homework # 5 Solutions
1: Recall that Z_n / μ^n is a martingale. In particular, by the conservation of expectation
property of martingales, E[Z_n] = μ^n. The total number of individuals in all the generations
combined is simply Σ_{n≥0} Z_n. Consequently,
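The identity E[Z_n] = μ^n is easy to check by simulation. A sketch (my own choice of offspring law: Binomial(2, 0.4), so μ = 0.8 and E[Z_3] = 0.512):

```python
import random

random.seed(8)
mu = 0.8   # offspring mean of the illustrative Binomial(2, 0.4) law

def offspring():
    return (random.random() < 0.4) + (random.random() < 0.4)

def generation_size(n):
    """Z_n for a Galton-Watson process started from Z_0 = 1."""
    z = 1
    for _ in range(n):
        z = sum(offspring() for _ in range(z))
    return z

trials = 50_000
mean_z3 = sum(generation_size(3) for _ in range(trials)) / trials
print(mean_z3, mu ** 3)   # conservation of expectation: E[Z_3] = mu^3 = 0.512
```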
E