Math 310/Stat 410
Homework 8
Fall, 2015
Section 4.4.
Problem 19. The Markov chain with states RR, CR, RC, CC has transition matrix

         RR   CR   RC   CC
    RR   .7    0   .3    0
    CR   .5    0   .5    0
    RC    0   .4    0   .6
    CC    0   .2    0   .8

The question is: what proportion of days does it rain? This is the proportion of d
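The long-run proportion of rainy days comes from the stationary distribution of this chain. Assuming the second letter of each state records today's weather, it rains today in states RR and CR (the answer is the same under the other reading). A minimal power-iteration sketch in plain Python:

```python
# Power iteration on the 4-state weather chain (states RR, CR, RC, CC).
P = [
    [0.7, 0.0, 0.3, 0.0],  # RR
    [0.5, 0.0, 0.5, 0.0],  # CR
    [0.0, 0.4, 0.0, 0.6],  # RC
    [0.0, 0.2, 0.0, 0.8],  # CC
]

pi = [0.25, 0.25, 0.25, 0.25]      # any starting distribution works
for _ in range(500):               # iterate pi <- pi P until it settles
    pi = [sum(pi[i] * P[i][j] for i in range(4)) for j in range(4)]

rain = pi[0] + pi[1]               # long-run proportion of rainy days
```

Solving pi P = pi by hand gives pi = (.25, .15, .15, .45), so it rains 40% of days in the long run.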
Math 310/Stat 410
Lecture 11
Fall, 2015
We generalize the problem of injuries in accidents as follows: Suppose
that N is Poisson(λ) and suppose that, conditional on N = n, X is
binomial(n, p). Let Y = N − X. Here again, N could be the number of
accidents, X th
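The standard conclusion of this setup (Poisson thinning) is that X and Y = N − X are independent Poisson with means λp and λ(1 − p). A simulation sketch, with λ = 5 and p = 0.3 chosen arbitrarily:

```python
import math
import random

random.seed(1)
lam, p = 5.0, 0.3

def poisson(lam):
    # Knuth's method: multiply uniforms until the product drops below e^{-lam}
    L, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod < L:
            return k
        k += 1

n_sims = 100000
xs, ys = [], []
for _ in range(n_sims):
    n = poisson(lam)
    x = sum(1 for _ in range(n) if random.random() < p)  # binomial(n, p)
    xs.append(x)
    ys.append(n - x)

mean_x = sum(xs) / n_sims   # should be close to lam * p = 1.5
mean_y = sum(ys) / n_sims   # should be close to lam * (1 - p) = 3.5
var_x = sum((x - mean_x) ** 2 for x in xs) / n_sims  # Poisson: variance = mean
```

The near-equality of mean_x and var_x is the telltale Poisson signature of the thinned count.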
Math 310/Stat 410
Lecture 12
Fall, 2015
Homework: Sections 3.3, 3.4
Conditional density functions
Example 1. Suppose that (X, Y) is uniformly distributed over the
triangular region 0 ≤ y ≤ x ≤ 1 with joint density f(x, y) = 2 over
this set.
Then f(x) = ∫_0^x 2 dy = 2x for 0 ≤ x ≤ 1.
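With marginal f(x) = 2x on [0, 1], we get E(X) = ∫_0^1 x · 2x dx = 2/3. A quick Monte Carlo check, sampling the triangle by rejection from the unit square:

```python
import random

random.seed(0)

# Sample (X, Y) uniformly over the triangle 0 <= y <= x <= 1 by
# rejection from the unit square, then check E(X) = 2/3.
samples = []
while len(samples) < 100000:
    x, y = random.random(), random.random()
    if y <= x:
        samples.append(x)

mean_x = sum(samples) / len(samples)
```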
Math 310/Stat 410
Lecture 13
Fall, 2015
Random Sums
We apply these ideas to sums of a random number of independent,
identically distributed random variables.
Let X1 , X2 , . . . be a sequence of IID random variables with mean μ
and variance σ².
Let N be in
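The standard identities for a random sum S = X1 + · · · + XN, with N independent of the Xi, are E(S) = E(N)μ and Var(S) = E(N)σ² + Var(N)μ². A simulation sketch with N Poisson(4) and Xi uniform(0, 1), both choices arbitrary:

```python
import math
import random

random.seed(2)

def poisson(lam):
    # Knuth's method for a Poisson sample
    L, k, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod < L:
            return k
        k += 1

lam = 4.0
mu, sigma2 = 0.5, 1.0 / 12.0       # mean and variance of uniform(0, 1)

sums = []
for _ in range(50000):
    n = poisson(lam)
    sums.append(sum(random.random() for _ in range(n)))

mean_s = sum(sums) / len(sums)     # ~ E(N)*mu = 4 * 0.5 = 2
var_s = sum((s - mean_s) ** 2 for s in sums) / len(sums)
# ~ E(N)*sigma2 + Var(N)*mu^2 = 4/12 + 4*(0.25) = 4/3
```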
Math 310/Stat 410
Lecture 5
Fall, 2015
Problem 41: Correction to my solution. Consider a sequence of n
coin tosses where P(H) = p and P(T) = q. This might represent up and
down in the stock market from day to day. Let X be the number of
switches from day to d
Math 310/Stat 410
Lecture 7
Fall, 2015
Section 2.6. Multivariate normal distributions. Multivariate
normal random variables are linear combinations of IID standard normal random variables. We write the random variables as vectors. All
vectors are column v
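Because a multivariate normal vector is a linear combination X = AZ of IID standard normals Z, its covariance matrix is A Aᵀ. A small check with an arbitrary 2 × 2 coefficient matrix A:

```python
import random

random.seed(3)

A = [[2.0, 0.0],
     [1.0, 1.0]]           # arbitrary coefficient matrix; Cov(X) = A A^T

n = 100000
xs = []
for _ in range(n):
    z = [random.gauss(0, 1), random.gauss(0, 1)]   # IID standard normals
    xs.append([A[0][0] * z[0] + A[0][1] * z[1],
               A[1][0] * z[0] + A[1][1] * z[1]])

# Sample covariance should approximate A A^T = [[4, 2], [2, 2]]
def cov(i, j):
    mi = sum(x[i] for x in xs) / n
    mj = sum(x[j] for x in xs) / n
    return sum((x[i] - mi) * (x[j] - mj) for x in xs) / n
```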
Math 310/Stat 410
Lecture 3
Fall, 2015
Homework due Monday: Sections 2.2, 2.3, 2.4, 2.5 (not 2.5.3)
Section 2.3 Continuous random variables
The expectation for a continuous random variable is given by
E(X) = ∫ x f(x) dx and has the same interpretation as the
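As a numerical illustration of E(X) = ∫ x f(x) dx, here is a Riemann-sum computation for an exponential density with mean 2 (a choice of density not taken from the notes):

```python
import math

# E(X) for the exponential density f(x) = (1/2) e^{-x/2}, which has mean 2.
dx = 0.001
mean = sum(x * 0.5 * math.exp(-x / 2) * dx
           for x in (i * dx for i in range(60000)))  # truncate at x = 60
```

The truncation and step-size errors here are far below the 10⁻³ level.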
Math 310/Stat 410
Lecture 10
Fall, 2015
Continuation of the betting problem. The probability of winning
each game is .6 and the probability of losing is .4. Suppose that each
time we bet 20% of our assets. Then Yn = Y0 X1 · · · Xn, where
P (Xi = 1.2) = .6 and P (Xi = .8) = .4.
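Whether such a multiplicative scheme grows hinges on E(log Xi), since log Yn = log Y0 + Σ log Xi. For this bet (win multiplies assets by 1.2 with probability .6, lose multiplies by .8 with probability .4):

```python
import math

# Expected log-growth per bet when 20% of assets are wagered
g = 0.6 * math.log(1.2) + 0.4 * math.log(0.8)
# g > 0, so by the law of large numbers Y_n grows in the long run
```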
Math 310
Stat 410
Random Processes and Applications
Fall, 2015
Class Meetings: M-W-F 3:10-4:00 in 201 X-S
Instructor: Professor Eisenberg, 237 XS, e-mail be01, tel. x83736.
Office hours: Mon-Weds-Fri 1:30-2:30.
Text: Introduction to Probability Models,
Math 310/Stat 410
Lecture 6
Fall, 2015
Homework due Monday: Sections 2.5.3, 2.6 (Moment generating
functions)
Sums of Independent Random Variables-the discrete case
Assume that X and Y are random variables taking values 0, 1, 2, . . . .
Let T = X + Y. The
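For nonnegative integer-valued X and Y, the distribution of T = X + Y is the convolution P(T = t) = Σ_{k=0}^{t} P(X = k) P(Y = t − k). A sketch with two small made-up pmfs:

```python
def convolve(px, py):
    """pmf of X + Y for independent X, Y, given as lists px[k] = P(X = k)."""
    pt = [0.0] * (len(px) + len(py) - 1)
    for i, a in enumerate(px):
        for j, b in enumerate(py):
            pt[i + j] += a * b
    return pt

px = [0.5, 0.3, 0.2]   # example pmf of X on {0, 1, 2}
py = [0.6, 0.4]        # example pmf of Y on {0, 1}
pt = convolve(px, py)
# P(T=0) = .30, P(T=1) = .38, P(T=2) = .24, P(T=3) = .08
```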
Math 310/Stat 410
Lecture 4
Fall, 2015
Section 2.5.3 Covariance and Correlation
The mean and standard deviation are parameters which describe the
distribution of a random variable. The mean gives a type of center
of the distribution and the standard devia
Math 310/Stat 410
Lecture 2
Fall, 2015
Section 2.3 Continuous random variables
Suppose that you observe the time until a component fails to the
exact number of hours. This will be random. The probability that
the time is between 2.1 and 2.2 is small. The
Math 310/Stat 410
Lecture 9
Fall, 2015
Homework: Sections 2.6 (Multivariate Normal), 2.8, and 3.2
Central Limit Theorem:

    lim_{n→∞} P( (Sn − nμ)/(σ√n) ≤ z ) = Φ(z)   for all z.

If we let Fn(z) be the cumulative distribution function of
(Sn − nμ)/(σ√n), then

    Fn(z) → Φ(z)

for all z.
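A quick empirical check of the CLT with uniform(0, 1) summands and n = 30 (both choices arbitrary): the standardized sum should have Fn(0) ≈ Φ(0) = .5 and Fn(1) ≈ Φ(1) ≈ .8413.

```python
import math
import random

random.seed(4)

n, reps = 30, 100000
mu, sigma = 0.5, math.sqrt(1.0 / 12.0)   # mean and sd of uniform(0, 1)

zs = []
for _ in range(reps):
    s = sum(random.random() for _ in range(n))
    zs.append((s - n * mu) / (sigma * math.sqrt(n)))  # standardized sum

F0 = sum(1 for z in zs if z <= 0) / reps   # ~ Phi(0) = 0.5
F1 = sum(1 for z in zs if z <= 1) / reps   # ~ Phi(1) ~ 0.8413
```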
Math 310/Stat 410
Lecture 14
Fall, 2015
On some notation for conditional expectation and conditional variance. Let E(Y | X = x) = h(x). Then h(x) is a function of the real
variable x. There is nothing random about it. For each x it is a number, so if we w
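A tiny discrete illustration of h(x) = E(Y | X = x) as an ordinary, non-random function of x, using a made-up joint pmf:

```python
# Hypothetical joint pmf: p[(x, y)] = P(X = x, Y = y)
p = {(0, 0): 0.2, (0, 1): 0.2,
     (1, 0): 0.1, (1, 1): 0.5}

def h(x):
    """h(x) = E(Y | X = x): a plain number for each x, nothing random."""
    px = sum(q for (a, _), q in p.items() if a == x)            # P(X = x)
    return sum(y * q for (a, y), q in p.items() if a == x) / px

ey = sum(y * q for (_, y), q in p.items())   # E(Y)
tot = 0.4 * h(0) + 0.6 * h(1)                # E(h(X)), the tower property
```

The last two lines check E(E(Y | X)) = E(Y) for this pmf.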
Math 310/Stat 410
Lecture 15
Fall, 2015
Homework: Sections 3.5, 3.6.3
Test 1: Friday. October 2 in class. Chapters 2 and 3.
Section 3.6.3 Uniform priors, Polya's Urn Model, and Bose-Einstein
Statistics
We next consider a model with random variables (N,Y),
Math 310/Stat 410
Lecture 16
Fall, 2015
Homework: Section 4.1 due Monday
Chapter 4. Markov Chains
Section 4.1 Introduction
A sequence of random variables is called a stochastic process. In
studying stochastic processes we need to know the joint distributi
Math 310/Stat 410
Practice Test 1
Fall, 2015
1. Teams A and B play a series of games until one team has won
five games. In each game team A has probability of winning .6 and
team B has probability of winning .4. The results of different games
are independent.
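Problem 1 can be checked by summing negative binomial terms: A wins the series on game 5 + j exactly when A wins that game and exactly 4 of the preceding 4 + j, so P(A wins) = Σ_{j=0}^{4} C(4+j, j) (.6)⁵ (.4)ʲ.

```python
from math import comb

p, q = 0.6, 0.4
# A wins on game 5 + j: final win, plus 4 wins among the first 4 + j games
p_A = sum(comb(4 + j, j) * p**5 * q**j for j in range(5))
# p_A = 0.73343232
```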
Math 310/Stat 410
Test 2A
Fall, 2014
Name:
Each problem is worth 20 points. Show your work!
A Markov chain Xn , n = 0, 1, . . . with 3 states has stationary
transition probability matrix P. The distribution vector of X0
1. X and Y are random variables with
E(X) = μX, E(Y) = μY, Var X = σX², Var Y = σY²,
and correlation ρ.
a) Express Var(Y − μY − c(X − μX)) in terms of c, μX, μY, σX, σY, and ρ.
Problem 1 continued.
b) Derive a formula for the value of c which minimizes this variance.
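For part a), expanding gives Var(Y − μY − c(X − μX)) = σY² + c²σX² − 2cρσXσY, and calculus gives the minimizer c = ρσY/σX for part b). A numerical sanity check with arbitrary parameter values:

```python
# Arbitrary example values (not from the test)
sx, sy, rho = 2.0, 3.0, 0.4

def v(c):
    # Var(Y - mu_Y - c(X - mu_X)) = sy^2 + c^2 sx^2 - 2 c rho sx sy
    return sy**2 + c**2 * sx**2 - 2 * c * rho * sx * sy

c_star = rho * sy / sx     # candidate minimizer from calculus
# a grid search around c_star confirms no nearby value does better
best = min(v(c_star + k * 0.001) for k in range(-1000, 1001))
```

At the minimum the variance equals σY²(1 − ρ²), the usual residual variance of linear prediction.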
Math 310/Stat 410
Practice Test 2
Fall, 2013
1. A 3 × 3 matrix M has the property that

    lim_{n→∞} M^n(i, j) = c_j , for all (i, j).

Show that

    Σ_{i=1}^{3} c_i M(i, j) = c_j , for all j.
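The claim can be checked numerically: for a matrix whose powers converge to a matrix with identical rows c, that row satisfies cM = c. A sketch with an arbitrary 3 × 3 stochastic matrix:

```python
M = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.3, 0.3, 0.4]]       # arbitrary 3x3 stochastic matrix

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

Mn = M
for _ in range(200):        # high power: rows converge to the common limit c
    Mn = matmul(Mn, M)

c = Mn[0]                   # limiting row: c_j = lim M^n(i, j)
cM = [sum(c[i] * M[i][j] for i in range(3)) for j in range(3)]
# cM reproduces c, i.e. sum_i c_i M(i, j) = c_j
```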
ar stu
ed d
vi y re
aC s
o
ou urc
rs e
eH w
er as
o.
co
m
2. A Markov chain Xn , n =
Math 310/Stat 410
Practice Test 2
Fall, 2014
1. An irreducible, aperiodic Markov chain Xn , n = 0, 1, . . . with 3
states has transition probability matrix P. Starting with the fact that
    lim_{n→∞} P (Xn = j | X0 = i) = πj , for i = 1, 2, 3, j = 1, 2, 3,

show th
Math 310/Stat 410
Practice Test 2 Solutions
Fall, 2014
1. An irreducible, aperiodic Markov chain Xn , n = 0, 1, . . . with 3
states has transition probability matrix P. Starting with the fact that
    lim_{n→∞} P (Xn = j | X0 = i) = πj , for i = 1, 2, 3, j = 1, 2, 3,
Math 310/Stat 410
Practice Final Test
Fall, 2014
1. X, Y and Z are independent, exponential random variables with
means 2,2, and 4, respectively.
a) Find P (2Y < Z).
b) Find P (X + Y < Z).
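A Monte Carlo check of problem 1 is consistent with the closed forms one gets from the competing-exponentials identity P(A < B) = λA/(λA + λB): since 2Y is exponential with mean 4, P(2Y < Z) = 1/2, and conditioning on X + Y gives P(X + Y < Z) = ((1/2)/(1/2 + 1/4))² = 4/9.

```python
import random

random.seed(5)

n = 200000
hits_a = hits_b = 0
for _ in range(n):
    x = random.expovariate(1 / 2)   # exponential, mean 2
    y = random.expovariate(1 / 2)   # exponential, mean 2
    z = random.expovariate(1 / 4)   # exponential, mean 4
    hits_a += (2 * y < z)
    hits_b += (x + y < z)

pa = hits_a / n   # ~ 1/2
pb = hits_b / n   # ~ 4/9
```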
2. Let X and Y be random variables with a bivariate normal distrib
Math 310/Stat 410
Practice Test 1
Fall, 2014
1. X and Y are independent Poisson random variables. Let T = X +Y.
Show that T has a Poisson distribution.
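Problem 1's claim can be checked numerically: if X is Poisson(λ1) and Y is Poisson(λ2), independently, then T = X + Y should be Poisson(λ1 + λ2). A convolution check with means 2 and 3 (arbitrary):

```python
import math

def poi_pmf(lam, k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

lx, ly = 2.0, 3.0

def conv_pmf(t):
    # P(X + Y = t) = sum_k P(X = k) P(Y = t - k)
    return sum(poi_pmf(lx, k) * poi_pmf(ly, t - k) for k in range(t + 1))

# compare the convolution with the Poisson(lx + ly) pmf term by term
max_diff = max(abs(conv_pmf(t) - poi_pmf(lx + ly, t)) for t in range(15))
```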
2. Assume that X and Y are independent, exponential random variables with E(X) = 6 and E(Y ) = 2.
a) Fi
Math 310/Stat 410
Lecture 33
Fall, 2015
Homework: Section 5.4.
Sections 7.2, 7.3, 7.4 due Monday
Decomposition of Poisson processes.
The probability that the first event is a type 1 event is p. The
number of events G until the first type 1 event is geometr
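If each event is independently type 1 with probability p, then G is geometric with mean 1/p. A simulation sketch with p = 0.3 (arbitrary):

```python
import random

random.seed(6)
p = 0.3

def geom_trial():
    # count events until (and including) the first type 1 event
    g = 1
    while random.random() >= p:
        g += 1
    return g

n = 100000
mean_g = sum(geom_trial() for _ in range(n)) / n   # ~ 1/p = 3.33...
```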
Math 310/Stat 410
Lecture 18
Fall, 2015
Section 4.2
Some variations on P (Xn = j | X0 = i).
Suppose we wish to find P (Xk = j at least once up to time n | X0 = i).
That is, what is the probability that state j is visited at some time
up to time n? Then we can
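One standard way to compute such "hit by time n" probabilities (a sketch, not necessarily the method the lecture goes on to use): make state j absorbing; then the (i, j) entry of the modified matrix's n-th power is P(Xk = j for some k ≤ n | X0 = i). The 3-state chain below is an arbitrary example, cross-checked by Monte Carlo:

```python
import random

P = [[0.5, 0.3, 0.2],
     [0.4, 0.4, 0.2],
     [0.1, 0.6, 0.3]]        # arbitrary 3-state transition matrix
j, n = 2, 5

Q = [row[:] for row in P]
Q[j] = [1.0 if k == j else 0.0 for k in range(3)]   # make state j absorbing

def matmul(A, B):
    return [[sum(A[i][k] * B[k][m] for k in range(3)) for m in range(3)]
            for i in range(3)]

Qn = Q
for _ in range(n - 1):       # Q^n
    Qn = matmul(Qn, Q)
hit = Qn[0][j]               # P(visit j by time n | X_0 = 0)

# Monte Carlo cross-check of the same probability
random.seed(7)
reps, count = 100000, 0
for _ in range(reps):
    s = 0
    for _ in range(n):
        u, acc = random.random(), 0.0
        for k in range(3):
            acc += P[s][k]
            if u < acc:
                s = k
                break
        if s == j:
            count += 1
            break
mc = count / reps
```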
Math 310/Stat 410
Lecture 31
Fall, 2015
Test 2: Friday, November 13.
Homework: Section 5.3, Part II, but omit 65, 66.
Exercise 3: Let T be the time of the first event in a Poisson process
N(t), t ≥ 0. Find P (T < 1 | N(3) = 1).
Solution:
P (T < 1 | N(3) = 1)
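Given N(3) = 1, the single arrival time is uniform on (0, 3), so the answer should be 1/3. A simulation check that conditions on runs with exactly one event in [0, 3] (rate 1 chosen arbitrarily; the answer does not depend on it):

```python
import random

random.seed(8)

rate, horizon = 1.0, 3.0
accepted = in_first = 0
for _ in range(100000):
    # build a Poisson process on [0, 3] from exponential interarrival times
    t, times = 0.0, []
    while True:
        t += random.expovariate(rate)
        if t > horizon:
            break
        times.append(t)
    if len(times) == 1:          # condition on N(3) = 1
        accepted += 1
        in_first += (times[0] < 1.0)

p_hat = in_first / accepted      # ~ 1/3
```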
Math 310/Stat 410
Lecture 20
Fall, 2015
We were studying the two-state Markov chain with transition matrix P.
The two-state chain. Let

    P =  1−r    r
          s    1−s

We found the stationary distribution given by

    π = ( s/(r+s) , r/(r+s) ).

Except for the cases

    P =  1  0        and        P =  0  1
         0  1                        1  0
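A numerical check of the stationary distribution π = (s/(r+s), r/(r+s)) for sample values r = 0.3, s = 0.1:

```python
r, s = 0.3, 0.1
P = [[1 - r, r],
     [s, 1 - s]]

pi = [0.5, 0.5]
for _ in range(500):             # power iteration: pi <- pi P
    pi = [pi[0] * P[0][0] + pi[1] * P[1][0],
          pi[0] * P[0][1] + pi[1] * P[1][1]]

# Exact answer: (s/(r+s), r/(r+s)) = (0.25, 0.75)
```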
Math 310/Stat 410
Lecture 19
Fall, 2015
Homework: Section 4.2
We have seen that

    P^(n) = P^n.

There are some cases where we can find formulas for the entries of
P^n.
Example: A message is sent. The message is either 0 or 1. Whatever
the message is, the prob