Math 310/Stat 410
Lecture 6
Fall, 2014
Homework: Sections 2.3, 2.6 (moment generating functions) due Monday.
Non-identically distributed random variables.
Suppose that X is exponential with rate λX and Y is exponential with rate λY. Then
P(min(X, Y) > t) = P(X > t)P(Y > t) = e^(−λX t) e^(−λY t) = e^(−(λX + λY)t),
so min(X, Y) is again exponential, with rate λX + λY.
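A quick simulation makes this concrete. The rates 1 and 2 below are illustrative choices, not values from the notes; the minimum should then be exponential with rate 3.

```r
# Simulation check: min of independent exponentials is exponential
# with rate lambda_X + lambda_Y (here 1 + 2 = 3).
set.seed(1)
n <- 100000
x <- rexp(n, rate = 1)
y <- rexp(n, rate = 2)
m <- pmin(x, y)
mean(m)        # should be near 1/3, the mean of an Exp(3) variable
mean(m > 0.5)  # should be near exp(-3 * 0.5)
```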
Math 310/Stat 410
Lecture 5
Fall, 2014
Maxima and Minima of Independent Continuous Random Variables
Say X1 , X2 , . . . are independent, identically distributed random variables with distribution function F (x). We call them IID random variables. We are o
Math 310/Stat 410
Lecture 16
Fall, 2014
Section 4.2 The distribution vector of Xn and the n-step transition
probability matrices
We see
P(X1 = j) = Σi P(X0 = i) Pij = Σi pi^(0) Pij.
It follows that in terms of vectors and matrix multiplicat
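In R the vector-matrix form of this computation is direct. The 2-state chain below is an illustrative example, not one from the notes.

```r
# Distribution of X1 from the initial distribution p^(0) and the
# transition matrix P, by matrix multiplication.
P <- matrix(c(0.9, 0.1,
              0.3, 0.7), nrow = 2, byrow = TRUE)
p0 <- c(0.5, 0.5)
p1 <- p0 %*% P   # row vector times transition matrix
p1               # P(X1 = 1) = 0.6, P(X1 = 2) = 0.4
p2 <- p1 %*% P   # two-step distribution; in general p^(n) = p^(0) P^n
```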
Math 310/Stat 410
Lecture 10
Fall, 2014
Section 3.3 The Continuous Case
There are two types of conditional probabilities and expectations in
the continuous case.
Say X is a continuous random variable with density f (x). We may
be
Math 310/Stat 410
Lecture 8
Fall, 2014
The weak law of large numbers states that if X1 , . . . , Xn are uncorrelated with mean μ and variance σ², then for any ε > 0,
P(|X̄n − μ| > ε) → 0, as n → ∞,
or
P(|X̄n − μ| ≤ ε) → 1, as n → ∞.
There is a stronger result if we assume th
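The weak law is easy to watch happen by simulation. The Uniform(0, 1) distribution (μ = 1/2) and ε = 0.05 below are illustrative choices.

```r
# Estimating P(|Xbar_n - mu| > eps) for growing n; by the weak law
# this probability shrinks toward 0.
set.seed(2)
eps <- 0.05
prob_dev <- function(n, reps = 2000) {
  xbar <- replicate(reps, mean(runif(n)))
  mean(abs(xbar - 0.5) > eps)
}
prob_dev(10)    # sizable
prob_dev(1000)  # near 0
```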
Math 310/Stat 410
Lecture 14
Fall, 2014
Let X0 , X1 , . . . , Xn be a non-negative martingale. Then applying
Markov's inequality we have
P(Xn ≥ c) ≤ E(Xn)/c.
But E(Xn) = E(X0), so we also have
P(Xn ≥ c) ≤ E(X0)/c
for all c. In fact more is true. Let Tc be the fir
Math 310/Stat 410
Lecture 4
Fall, 2014
Section 2.3 Continuous random variables
Suppose that you observe the time until a component fails to the
exact number of hours. This will be random. The probability that
the time is between 2.1 and 2.2 is small. The
Math 310/Stat 410
Lecture 1
Fall, 2014
This course is titled Random Processes and their Applications. We
begin by asking, What is a random process? We know that a random
variable is a variable whose value is unpredictable. For example, the
movement in a s
Math 310/Stat 410
Lecture 3
Fall, 2014
Homework: Sections 2.2, 2.4, 2.5.3 due Monday.
Answer to question at end of class: P(W4 ≤ 7) = P(N7 ≥ 4). Why
is that? Look at the sequence N1 , N2 , N3 , N4 , N5 , N6 , N7 . If N7 ≥ 4,
then there will be a first time that
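The identity can also be checked numerically. Assuming a rate-1 Poisson process (the notes do not fix a rate here), W4 is Gamma(4, 1) and N7 is Poisson(7), and the two probabilities agree exactly:

```r
# Numerical check of P(W_4 <= 7) = P(N_7 >= 4) for a rate-1 process.
pgamma(7, shape = 4, rate = 1)  # P(W_4 <= 7)
1 - ppois(3, lambda = 7)        # P(N_7 >= 4); identical
```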
Math 310/Stat 410
Lecture 2
Fall, 2014
Homework: Sections 2.2, 2.4, 2.5.3 will be due Monday.
Continuing with the interpretation of correlation.
Suppose that we wish to predict the value of Y by a constant c. We
wish to minimize the mean squared error, th
Math 310/Stat 410
Lecture 13
Fall, 2014
Martingales
Let Y0 , Y1 , Y2 , . . . be a sequence of random variables. We say that
the sequence is a martingale if
E(Yn+1 | Y0 , . . . , Yn ) = Yn
for all n.
A fundamental property of martingales is that E(Yn ) = E(Y0 ) for all n.
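The standard first example is a symmetric random walk: each step is ±1 with probability 1/2, so E(Yn+1 | Y0 , . . . , Yn ) = Yn . A short simulation (my own illustrative check) shows the mean staying at E(Y0 ) = 0:

```r
# Simulate 5000 symmetric random walks of 20 steps each; since the
# walk is a martingale, E(S_n) = E(S_0) = 0 at every time n.
set.seed(3)
steps <- matrix(sample(c(-1, 1), 5000 * 20, replace = TRUE), nrow = 5000)
S <- t(apply(steps, 1, cumsum))  # row i = one walk, column n = S_n
colMeans(S)[c(1, 10, 20)]        # all near 0
```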
Math 310/Stat 410
Lecture 5
Fall, 2016
Homework: Assignments 3,4,5 from the syllabus will be due on Monday.
We can see how the normal approximation works for the binomial
distribution by comparing the exact distribution with the normal approximating curve
R commands for simulation of Bernoulli and Binomial random variables
> # We first simulate some Bernoulli random variables with p=.6.
> # A Bernoulli random variable is a binomial random variable with n=1.
> rbinom(3, 1, .6)
[1] 1 0 0
> # We could estim
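The normal approximation mentioned above can be inspected the same way. The values n = 50 and p = 0.6 are illustrative choices for the comparison.

```r
# Compare the exact Binomial(n, p) pmf with the normal density having
# the same mean np and variance np(1-p).
n <- 50; p <- 0.6
k <- 0:n
exact <- dbinom(k, n, p)
approx <- dnorm(k, mean = n * p, sd = sqrt(n * p * (1 - p)))
max(abs(exact - approx))  # small: the normal curve tracks the pmf
```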
Math 310/Stat 410
Lecture 1
Fall, 2016
This course is titled Random Processes and their Applications. We
begin by asking, What is a random process? We know that a random
variable is a variable whose value is unpredictable. For example, the
movement in a s
Math 310/Stat 410
Lecture 2
Fall, 2016
Homework due Monday. Numbers 1 and 2 from the syllabus. For
the R exercises, cut and paste your work from the computer onto a
word page and print it.
We just showed that if Nn is binomial(n,p), then E(Nn ) = np.
In a
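The identity E(Nn ) = np is easy to confirm by simulation; n = 10 and p = 0.3 below are illustrative choices.

```r
# Simulation check of E(N_n) = np for a binomial random variable.
set.seed(4)
sims <- rbinom(100000, size = 10, prob = 0.3)
mean(sims)  # close to 10 * 0.3 = 3
```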
Math 310/Stat 410
Lecture 3
Fall, 2016
Homework due Monday. Numbers 1 and 2 from the syllabus. For
the R exercises, cut and paste your work from the computer onto a
word page and print it.
Uncorrelated versus independent random variables.
Let X be a non-n
Stat 410 Homework 1
1. Expectation, variance, Bernoulli random variables, binomial distribution
1.1. How many orderings are there of a standard 52-card deck?
> factorial(52)
[1] 8.065818e+67
3.1.
> choose(52,5)
[1] 2598960
3.2.
> choose(52,13)
[1] 635013559
Math 310/Stat 410
Lecture 17
Fall, 2014
The stationary distribution of a Markov chain is often denoted by π.
In such a case we have πP = π.
Starting in equilibrium with the stationary distribution, the transition matrix Q of the time reversed process is stat
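The equation πP = π says that π is a left eigenvector of P with eigenvalue 1, which gives a way to compute it numerically. The 2-state P below is an illustrative example, not one from the notes.

```r
# Find the stationary distribution pi of a small chain via eigen().
P <- matrix(c(0.9, 0.1,
              0.3, 0.7), nrow = 2, byrow = TRUE)
e <- eigen(t(P))                   # left eigenvectors of P
i <- which.min(abs(e$values - 1))  # pick the eigenvalue-1 eigenvector
pi_vec <- Re(e$vectors[, i])
pi_vec <- pi_vec / sum(pi_vec)     # normalize to a probability vector
pi_vec        # (0.75, 0.25) for this P
pi_vec %*% P  # equals pi_vec: stationarity
```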
Math 310/Stat 410
Lecture 15
Fall, 2014
Homework: Sections 3.6.3, 4.1, 4.2
From Section 3.6.6
Let X, X1 , X2 , . . . be IID random variables taking values −1, 0, 1, 2, . . . .
Let Sn = X1 + · · · + Xn and assume μ = E(X) > 0.
Then
E(Tk ) = kE(T1 ) and Var Tk = k Var T1 .
Math 310/Stat 410
Lecture 7
Fall, 2014
Multivariate normal distribution. Multivariate normal random
variables are linear combinations of IID standard normal random variables. We write the random variables as vectors. All vectors are column
vectors.
That i
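The construction "linear combinations of IID standard normals" can be written out directly: X = μ + AZ has covariance matrix AAᵀ. The particular μ and A below are illustrative choices.

```r
# Build multivariate normal vectors as mu + A Z with Z IID standard
# normal; the sample covariance should approach A %*% t(A).
set.seed(7)
A <- matrix(c(2, 0,
              1, 1), nrow = 2, byrow = TRUE)
mu <- c(1, -1)
Z <- matrix(rnorm(2 * 50000), nrow = 2)
X <- mu + A %*% Z   # 2 x 50000: each column is one draw
cov(t(X))           # close to A %*% t(A)
A %*% t(A)
```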
Math 310/Stat 410
Lecture 11
Fall, 2014
Conditional expectation obeys the same rules of expectation for each
xed x, since we are just using a conditional probability distribution
in place of a probability distribution to compute it.
For example, we nd
E(a
Math 310/Stat 410
Lecture 12
Fall, 2014
Homework: Sections 3.3, 3.4, 3.5 due Monday.
Random Sums
Let X1 , X2 , . . . be a sequence of IID random variables with mean μ
and variance σ².
Let N be independent of X1 , . . . and take values 0, 1, 2, . . . .
We define SN = X1 + · · · + XN .
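A quick simulation of a random sum checks the familiar identity E(SN ) = E(N)μ (Wald's identity). The choices X ~ Exp(1), so μ = 1, and N ~ Poisson(4) are illustrative.

```r
# Simulate S_N = X_1 + ... + X_N with N independent of the X's;
# Wald's identity gives E(S_N) = E(N) * mu = 4 * 1 = 4.
set.seed(8)
rand_sum <- function() {
  n <- rpois(1, lambda = 4)
  if (n == 0) 0 else sum(rexp(n, rate = 1))
}
s <- replicate(50000, rand_sum())
mean(s)  # close to 4
```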
Math 310/Stat 410
Lecture 22
Fall, 2014
Gambler's ruin
Suppose that you and your opponent each begin with $k where the
probability of you winning each game is p < q. Then the probability
that you win is
(1 − (q/p)^k) / (1 − (q/p)^(2k)) = 1 / (1 + (q/p)^k).
For simplicity, s
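A simulation of the game agrees with the formula. The stakes k = 5 and game-win probability p = 0.45 below are illustrative choices.

```r
# Gambler's ruin: play one-dollar games until someone is broke;
# theory gives P(you win) = 1 / (1 + (q/p)^k), about 0.268 here.
set.seed(5)
k <- 5; p <- 0.45; q <- 1 - p
play <- function() {
  fortune <- k
  while (fortune > 0 && fortune < 2 * k)
    fortune <- fortune + sample(c(1, -1), 1, prob = c(p, q))
  fortune == 2 * k  # TRUE if you take all the money
}
est <- mean(replicate(20000, play()))
est                  # simulated winning probability
1 / (1 + (q / p)^k)  # theoretical value
```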
Math 310/Stat 410
Syllabus
Text. Introduction to Probability Models, 11th edition, by Sheldon Ross.
Chapter 2. Random Variables
2.2 Discrete random variables # 23, 24, 31
2.4 Expectation and variance # 49, 50
2.5.3 Covariance # 85
2.3 Continuous random va
Math 310/Stat 410
Lecture 30
Fall, 2014
Chapter 7: Renewal Theory and its Applications
Sections 7.1, 7.2:
Here we generalize the Poisson process. We have a series of events,
thought of as failures of components. At each failure the component is
replaced w
Math 310/Stat 410
Lecture 33
Fall, 2014
The Maximum Variable and the Reflection Principle.
Let a > 0. Let Ta be the first time that B(t) = a.
Let M (t) = max(B(u), 0 ≤ u ≤ t).
Then M (t) ≥ a is equivalent to Ta ≤ t.
We have
P (B(t) ≥ a) = P (Ta ≤ t, B(t) ≥ a)
= P (Ta ≤ t)P (B(
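The reflection-principle conclusion, P(M(t) ≥ a) = 2P(B(t) ≥ a), can be checked on discretized Brownian paths. The horizon t = 1, level a = 1, and grid size below are illustrative choices, and the grid maximum slightly undershoots the true maximum.

```r
# Reflection-principle check: P(M(1) >= 1) should be close to
# 2 * P(B(1) >= 1) = 2 * pnorm(-1), about 0.317.
set.seed(6)
nsteps <- 1000; tmax <- 1; a <- 1
incr <- matrix(rnorm(5000 * nsteps, sd = sqrt(tmax / nsteps)), nrow = 5000)
B <- t(apply(incr, 1, cumsum))      # row i = one Brownian path on the grid
est <- mean(apply(B, 1, max) >= a)  # simulated P(M(1) >= 1)
est
2 * pnorm(-a / sqrt(tmax))          # theoretical value
```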
Math 310/Stat 410
Lecture 9
Fall, 2014
Homework due Monday: Sections 2.6 Multivariate normal, 2.8, 3.2
Chapter 3. Conditional Probability and Expectation
Sec. 3.2 The Discrete Case (due Monday).
X and Y are random variables with a joint distribution. Given
Y
Math 310/Stat 410
Lecture 32
Fall, 2014
Homework: Sections 7.2, 7.3, 7.4, 10.1 due Monday.
Test 2 will be given in class next Friday, November 21. It will cover
Chapters 4,5, and 7.
Chapter 10. Brownian Motion.
We motivated the Poisson process by consider