Math 310/Stat 410
Lecture 6
Fall, 2014
Homework: Sections 2.3, 2.6 (moment generating functions) due Monday.
Non-identically distributed random variables.
Suppose that X is exponential ($\lambda_X$) and Y is exponential ($\lambda_Y$
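Two standard facts about independent exponentials with rates $\lambda_X$ and $\lambda_Y$ (a sketch in the notation above; the lecture's own route may differ):

```latex
P(X < Y) = \int_0^\infty \lambda_X e^{-\lambda_X x}\, e^{-\lambda_Y x}\, dx
         = \frac{\lambda_X}{\lambda_X + \lambda_Y},
\qquad
P(\min(X, Y) > t) = e^{-\lambda_X t}\, e^{-\lambda_Y t},
```

so $\min(X, Y)$ is exponential with rate $\lambda_X + \lambda_Y$.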
Math 310/Stat 410
Lecture 5
Fall, 2014
Maxima and Minima of Independent Continuous Random Variables
Say X1 , X2 , . . . are independent, identically distributed random variables with distribution func
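Writing $F$ for the common distribution function, independence gives the distributions of the extremes directly:

```latex
P\big(\max(X_1, \ldots, X_n) \le x\big) = P(X_1 \le x) \cdots P(X_n \le x) = F(x)^n,
\qquad
P\big(\min(X_1, \ldots, X_n) > x\big) = \big(1 - F(x)\big)^n.
```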
Math 310/Stat 410
Lecture 16
Fall, 2014
Section 4.2 The distribution vector of Xn and the n-step transition
probability matrices
We see
$$P(X_1 = j) = \sum_i P(X_0 = i)\,P_{ij} = \sum_i p_i^{(0)} P_{ij}.$$
It
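As a small numerical check of this formula (the two-state chain below is a made-up example, not one from the lecture):

```r
# A made-up 2-state transition matrix P and initial distribution p0.
P  <- matrix(c(0.9, 0.1,
               0.5, 0.5), nrow = 2, byrow = TRUE)
p0 <- c(0.5, 0.5)            # P(X0 = 1), P(X0 = 2)

p1 <- p0 %*% P               # p1[j] = sum_i p0[i] * P[i, j]
p1                           # distribution of X1: 0.7 0.3

p3 <- p0 %*% P %*% P %*% P   # three-step distribution, still sums to 1
```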
Math 310/Stat 410
Lecture 10
Fall, 2014
Section 3.3 The Continuous Case
Sec. 3.3 Continuous case:
There are two types of conditional probabilities and expectations in
the continuous case.
Say X is a c
Math 310/Stat 410
Lecture 8
Fall, 2014
The weak law of large numbers states that if $X_1, \ldots, X_n$ are uncorrelated with mean $\mu$ and variance $\sigma^2$, then for any $\epsilon > 0$,
$$P(|\bar{X}_n - \mu| > \epsilon) \to 0, \text{ as } n \to \infty,$$
or
$$P(|\bar{X}_n - \mu| \le \epsilon) \to 1.$$
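A quick R simulation illustrates the convergence (the Uniform(0,1) variables with $\mu = 1/2$, and the particular $n$ and $\epsilon$ values, are illustrative choices):

```r
set.seed(1)
eps <- 0.02
for (n in c(10, 100, 10000)) {
  # 1000 sample means of n Uniform(0,1) draws; here mu = 0.5.
  xbars <- replicate(1000, mean(runif(n)))
  cat("n =", n, ": P(|Xbar - 0.5| <=", eps, ") ~",
      mean(abs(xbars - 0.5) <= eps), "\n")
}
```

The estimated probability climbs toward 1 as $n$ grows.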
Math 310/Stat 410
Lecture 14
Fall, 2014
Let $X_0, X_1, \ldots, X_n$ be a non-negative martingale. Then applying
Markov's inequality we have
$$P(X_n \ge c) \le \frac{E(X_n)}{c}.$$
But $E(X_n) = E(X_0)$, so we also have
$$P(X_n \ge c) \le \frac{E(X_0)}{c}.$$
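To see the bound numerically in R (the multiplicative martingale below, a product of IID Uniform(0,2) factors with mean 1, is an illustrative example, not one from the lecture):

```r
set.seed(2)
# X_n = U_1 * ... * U_n with U_i IID Uniform(0, 2), so E(U_i) = 1,
# X_n is non-negative, and E(X_n) = E(X_0) = 1.
n    <- 10
cval <- 5
Xn   <- replicate(10000, prod(runif(n, 0, 2)))
mean(Xn >= cval)   # empirical P(X_n >= cval); Markov bound: 1/cval = 0.2
```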
Math 310/Stat 410
Lecture 4
Fall, 2014
Section 2.3 Continuous random variables
Suppose that you observe the time until a component fails to the
exact number of hours. This will be random. The probabil
Math 310/Stat 410
Lecture 1
Fall, 2014
This course is titled Random Processes and their Applications. We
begin by asking, What is a random process? We know that a random
variable is a variable whose v
Math 310/Stat 410
Lecture 3
Fall, 2014
Homework: Sections 2.2, 2.4, 2.5.3 due Monday.
Answer to question at end of class: $P(W_4 \le 7) = P(N_7 \ge 4)$. Why
is that? Look at the sequence N1 , N2 , N3 , N4 , N5
Math 310/Stat 410
Lecture 2
Fall, 2014
Homework: Sections 2.2, 2.4, 2.5.3 will be due Monday.
Continuing with the interpretation of correlation.
Suppose that we wish to predict the value of Y by a con
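A standard continuation (a sketch, which may differ from the lecture's exact path: predicting $Y$ by a linear function $a + bX$): minimizing the mean squared error $E(Y - a - bX)^2$ over $a$ and $b$ gives

```latex
b = \frac{\mathrm{Cov}(X, Y)}{\mathrm{Var}(X)}, \qquad a = E(Y) - b\,E(X),
```

with minimum mean squared error $\mathrm{Var}(Y)\,(1 - \rho^2)$, where $\rho$ is the correlation of $X$ and $Y$; this is where the interpretation of $\rho$ comes from.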
Math 310/Stat 410
Lecture 13
Fall, 2014
Martingales
Let Y0 , Y1 , Y2 , . . . be a sequence of random variables. We say that
the sequence is a martingale if
$E(Y_{n+1} \mid Y_0, \ldots, Y_n) = Y_n$
for all n.
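For example, if $X_1, X_2, \ldots$ are IID with mean $0$ and $Y_n = Y_0 + X_1 + \cdots + X_n$, then

```latex
E(Y_{n+1} \mid Y_0, \ldots, Y_n)
  = E(Y_n + X_{n+1} \mid Y_0, \ldots, Y_n)
  = Y_n + E(X_{n+1})
  = Y_n,
```

so partial sums of IID mean-zero random variables form a martingale.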
A
Math 310/Stat 410
Lecture 5
Fall, 2016
Homework: Assignments 3,4,5 from the syllabus will be due on Monday.
We can see how the normal approximation works for the binomial
distribution by comparing the
R commands for simulation of Bernoulli and Binomial random variables
> # We first simulate some Bernoulli random variables with p=.6.
> # A Bernoulli random variable is a binomial random variable with n = 1.
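A sketch of such commands (the sample sizes and the continuity-corrected comparison below are illustrative choices):

```r
set.seed(3)
# Bernoulli(p = .6) draws: binomial with size n = 1.
bern <- rbinom(20, size = 1, prob = 0.6)
bern

# Normal approximation check for Binomial(100, .6):
exact  <- pbinom(60, size = 100, prob = 0.6)
approx <- pnorm(60.5, mean = 100 * 0.6,
                sd = sqrt(100 * 0.6 * 0.4))  # with continuity correction
c(exact, approx)                             # the two values are close
```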
Math 310/Stat 410
Lecture 1
Fall, 2016
This course is titled Random Processes and their Applications. We
begin by asking, What is a random process? We know that a random
variable is a variable whose v
Math 310/Stat 410
Lecture 2
Fall, 2016
Homework due Monday. Numbers 1 and 2 from the syllabus. For
the R exercises, cut and paste your work from the computer onto a
word page and print it.
We just sho
Math 310/Stat 410
Lecture 3
Fall, 2016
Homework due Monday. Numbers 1 and 2 from the syllabus. For
the R exercises, cut and paste your work from the computer onto a
word page and print it.
Uncorrelate
Stat 410 Homework 1
1. Expectation, variance, Bernoulli random variables, binomial distribution
1.1. How many orderings are there of a standard 52-card deck?
> factorial(52)
[1] 8.065818e+67
3.1. > cho
Math 310/Stat 410
Lecture 17
Fall, 2014
The stationary distribution of a Markov chain is often denoted by $\pi$.
In such a case we have $\pi P = \pi$.
Starting in equilibrium with the stationary distribution, the t
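Numerically, $\pi$ can be computed as the left eigenvector of $P$ for eigenvalue $1$; the two-state matrix below is a made-up example:

```r
P <- matrix(c(0.9, 0.1,
              0.5, 0.5), nrow = 2, byrow = TRUE)
e      <- eigen(t(P))               # left eigenvectors of P
i      <- which.max(Re(e$values))   # pick out the eigenvalue 1
pi_vec <- Re(e$vectors[, i])
pi_vec <- pi_vec / sum(pi_vec)      # normalize to a probability vector
pi_vec                              # (5/6, 1/6) for this chain
pi_vec %*% P                        # equals pi_vec
```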
Math 310/Stat 410
Lecture 7
Fall, 2014
Multivariate normal distribution. Multivariate normal random
variables are linear combinations of IID standard normal random variables. We write the random varia
Math 310/Stat 410
Lecture 11
Fall, 2014
Conditional expectation obeys the same rules as ordinary expectation for each
fixed x, since we are just using a conditional probability distribution
in place of a probabi
Math 310/Stat 410
Lecture 12
Fall, 2014
Sections 3.3, 3.4, 3.5 due Monday
Random Sums
Let $X_1, X_2, \ldots$ be a sequence of IID random variables with mean $\mu$
and variance $\sigma^2$.
Let N be independent of X1 ,
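Letting $S = X_1 + \cdots + X_N$ with $N$ independent of the $X_i$, conditioning on $N$ gives the standard random-sum formulas:

```latex
E(S) = E(N)\,\mu, \qquad \mathrm{Var}(S) = E(N)\,\sigma^2 + \mathrm{Var}(N)\,\mu^2.
```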
Math 310/Stat 410
Lecture 22
Fall, 2014
Gambler's ruin
Suppose that you and your opponent each begin with $k where the
probability of you winning each game is p, where p < q = 1 - p. Then the probability
that you win i
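For reference, the standard gambler's-ruin computation (starting from $k$, stopping at $0$ or $2k$; a sketch consistent with the setup above) gives

```latex
P(\text{you reach } 2k \text{ before } 0)
  = \frac{1 - (q/p)^k}{1 - (q/p)^{2k}}
  = \frac{1}{1 + (q/p)^k},
```

which is less than $1/2$ when $p < q$, since then $q/p > 1$.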
Math 310/Stat 410
Syllabus
Text. Introduction to Probability Models, 11th edition, by Sheldon Ross.
Chapter 2. Random Variables
2.2 Discrete random variables # 23, 24, 31
2.4 Expectation and variance
Math 310/Stat 410
Lecture 30
Fall, 2014
Chapter 7: Renewal Theory and its Applications
Sections 7.1, 7.2:
Here we generalize the Poisson process. We have a series of events,
thought of as failures of
Math 310/Stat 410
Lecture 33
Fall, 2014
The Maximum Variable and the Reflection Principle.
Let $a > 0$. Let $T_a$ be the first time that $B(t) = a$.
Let $M(t) = \max(B(u),\ 0 \le u \le t)$.
Then $M(t) \ge a$ is equivalent to $T_a \le t$.
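The reflection principle then gives the distribution of the maximum of standard Brownian motion:

```latex
P(M(t) \ge a) = P(T_a \le t) = 2\,P(B(t) \ge a)
             = 2\left(1 - \Phi\!\left(\frac{a}{\sqrt{t}}\right)\right),
```

where $\Phi$ is the standard normal distribution function.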
Math 310/Stat 410
Lecture 9
Fall, 2014
Homework due Monday: Sections 2.6 Multivariate normal, 2.8, 3.2
Chapter 3. Conditional Probability and Expectation
Sec. 3.2 The Discrete Case (due Monday).
X and Y
Math 310/Stat 410
Lecture 32
Fall, 2014
Homework: Sections 7.2, 7.3, 7.4, 10.1 due Monday.
Test 2 will be given in class next Friday, November 21. It will cover
Chapters 4,5, and 7.
Chapter 10. Browni