Stat 5101 Lecture Slides
Deck 3

Charles J. Geyer
School of Statistics
University of Minnesota

Deja Vu

Now we go back to the beginning and do everything again.

Probability Mass Functions

A probability mass function (PMF) is a function f : S → R whose domain S, which can be any nonempty set, is called the sample space, whose codomain is the real numbers, and which satisfies the following conditions: its values are nonnegative,

    f(x) \ge 0,    x \in S,

and sum to one,

    \sum_{x \in S} f(x) = 1.

(Exactly the same as slide 20, deck 1.)

Infinite Sample Spaces

This time we allow infinite sample spaces. That means the sum

    \sum_{x \in S} f(x) = 1

is an infinite series. So we are now using calculus.

Bernoulli Process

A Bernoulli process is an infinite sequence of random variables X_1, X_2, ... (a stochastic process) that are IID Ber(p).

Geometric Distribution

The number of zeros (failures) before the first one (success) in a Bernoulli process is a random variable Y that has the geometric distribution with success probability p, denoted Geo(p) for short. Clearly, Y takes values in N = {0, 1, 2, ...}. Its PMF is given by

    f_p(y) = \Pr(Y = y)

because that is the formula for any PMF.

Geometric Distribution (cont.)

If Y = y, then we know that the first y variables in the Bernoulli process have the value zero and that X_{y+1} = 1, and we don't know anything else about the rest of the infinite sequence X_1, X_2, .... The probability of observing y failures and then one success, in that order, is (1 - p)^y p. There is no binomial coefficient, because only one order is considered. Hence the PMF of the Geo(p) distribution is

    f_p(y) = p (1 - p)^y,    y = 0, 1, 2, ....

Geometric Distribution (cont.)

With every brand-name distribution comes a theorem that says the probabilities sum to one. For the geometric distribution, this theorem is

    \sum_{y=0}^{\infty} p (1 - p)^y = 1.

This is a special case of the geometric series

    \sum_{n=0}^{\infty} s^n = \frac{1}{1 - s},    -1 < s < 1.

Here s = 1 - p.

Geometric Distribution (cont.)

The geometric series converges only when -1 < s < 1, which here is -1 < 1 - p < 1, that is, 0 < p < 2. Of course, we also know p ≤ 1 because p is a probability. Thus the parameter space of the geometric family of distributions is

    \{ p \in \mathbb{R} : 0 < p \le 1 \}.

Unlike the Bernoulli and binomial distributions, p = 0 is not allowed. What goes wrong is that when p = 0 and we try to sum the infinite series

    \sum_{y=0}^{\infty} (1 - p)^y = 1 + 1 + 1 + \cdots,

it does not converge.

Geometric Distribution (cont.)

So we had to be careful. The phrase "number of failures before the first success in a Bernoulli process" does not define a random variable when the success probability is p = 0, because the first success never happens! ...
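
The sums-to-one theorem and the geometric-series identity behind it are easy to verify numerically. Below is a minimal Python sketch of that check; it is an illustration added here, not part of the original slides, and the helper names geo_pmf and truncated_sum are invented for the example.

# Numerical check of the Geo(p) sums-to-one theorem (illustrative sketch only).

def geo_pmf(y, p):
    """Geo(p) PMF from the slides: f_p(y) = p * (1 - p)**y."""
    return p * (1.0 - p) ** y

def truncated_sum(p, n_terms=10_000):
    """Partial sum of the Geo(p) probabilities over y = 0, ..., n_terms - 1."""
    return sum(geo_pmf(y, p) for y in range(n_terms))

for p in (0.1, 0.5, 0.9):
    print(p, truncated_sum(p))          # each prints a number extremely close to 1

# The geometric series sum_{n >= 0} s**n = 1 / (1 - s) for -1 < s < 1,
# with s = 1 - p, is what makes the probabilities sum to one.
s = 0.7
print(sum(s ** n for n in range(10_000)), 1.0 / (1.0 - s))   # the two agree closely

Truncating at 10,000 terms is enough here because the tail of the series is geometrically small for any p in (0, 1].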
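
The definition of Y as the number of failures before the first success can also be checked by simulation. The following sketch (again an added illustration with invented names, using only the Python standard library) draws IID Ber(p) trials, counts the zeros before the first one, and compares the empirical frequencies with the Geo(p) PMF p(1 - p)^y.

import random
from collections import Counter

def failures_before_first_success(p, rng):
    """Count zeros before the first one in a stream of IID Ber(p) trials.

    Requires p > 0; with p = 0 the loop would never end, which is exactly
    the point made on the last slide above.
    """
    y = 0
    while rng.random() >= p:            # each trial is a success with probability p
        y += 1
    return y

rng = random.Random(42)                 # fixed seed so the run is repeatable
p = 0.3
n_reps = 100_000
counts = Counter(failures_before_first_success(p, rng) for _ in range(n_reps))

for y in range(6):
    print(y, counts[y] / n_reps, p * (1 - p) ** y)   # empirical vs. Geo(p) PMF

With 100,000 replications the empirical frequencies typically match the PMF to two or three decimal places.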

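
The boundary cases of the parameter space can be checked the same way. In this sketch (another added illustration, reusing the invented geo_pmf helper from the first sketch), p = 1 poses no problem because all of the probability sits at y = 0, while p = 0 leaves nothing that could sum to one, in line with the divergent series 1 + 1 + 1 + ... on the last slide.

# Why p = 1 is allowed but p = 0 is not (illustrative sketch only).

def geo_pmf(y, p):
    """Geo(p) PMF: p * (1 - p)**y (note 0.0**0 evaluates to 1.0 in Python)."""
    return p * (1.0 - p) ** y

def truncated_sum(p, n_terms=10_000):
    return sum(geo_pmf(y, p) for y in range(n_terms))

print(truncated_sum(1.0))   # 1.0 -- p = 1 works: all probability mass is at y = 0
print(truncated_sum(0.0))   # 0.0 -- p = 0 gives no valid PMF: nothing sums to one

# The series the slides point to, sum of (1 - p)**y with p = 0, is 1 + 1 + 1 + ...
# Its partial sums just keep growing, so there is nothing to normalize.
for n_terms in (10, 100, 1000):
    print(n_terms, sum(1.0 ** y for y in range(n_terms)))   # prints 10.0, 100.0, 1000.0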
