Chapter 1: Special Distributions

1. Special Distributions
   - Bernoulli, binomial, geometric, and negative binomial
   - Sampling with and without replacement; hypergeometric
   - Finite sample variance correction
   - Poisson and an "informal" Poisson process
   - Stationary and independent increments
   - Exponential and gamma; strong Markov property
   - Normal, and the classical CLT; chi-square
   - Uniform, beta, uniform order statistics
   - Cauchy
   - Rademacher, and symmetrization
   - Multinomial, and its moments

2. Convolution and related formulas
   - Sums, products, and quotients
   - Student's t; Snedecor's F; and beta

3. The multivariate normal distribution
   - Properties of covariance matrices
   - Characteristic function
   - Marginals, independence, and linear combinations
   - Linear independence
   - The multivariate normal density
   - Conditional densities
   - Facts about chi-square distributions

4. General integration by parts formulas
   - Representations of random variables
   - Formulas for means, variances, and covariances via integration by parts
Chapter 1: Special Distributions

1. Special Distributions

Independent Bernoulli Trials

If $P(X = 1) = p = 1 - P(X = 0)$, then $X$ is said to be a Bernoulli($p$) random variable. We refer to the event $[X = 1]$ as success, and to $[X = 0]$ as failure. Let $X_1, \ldots, X_n$ be i.i.d. Bernoulli($p$), and let $S_n = X_1 + \cdots + X_n$ denote the number of successes in $n$ independent Bernoulli($p$) trials. Now
$$P(X_i = x_i,\ i = 1, \ldots, n) = p^{\sum_{i=1}^n x_i} (1-p)^{n - \sum_{i=1}^n x_i}$$
if all $x_i$ equal 0 or 1; this formula gives the joint distribution of $X_1, \ldots, X_n$. From this we obtain
$$P(S_n = k) = \binom{n}{k} p^k (1-p)^{n-k} \quad \text{for } k = 0, 1, \ldots, n, \tag{1}$$
since each of the $\binom{n}{k}$ different placings of $k$ 1's in an $n$-vector containing $k$ 1's and $n-k$ 0's has probability $p^k (1-p)^{n-k}$ from the previous sentence. We say that $S_n \sim$ Binomial($n, p$) when (1) holds. Note that Binomial($1, p$) is the same as Bernoulli($p$).

Let $X_1, X_2, \ldots$ be i.i.d. Bernoulli($p$). Let $Y_1 \equiv W_1 \equiv \min\{n : S_n = 1\}$. Since $[Y_1 = k] = [X_1 = 0, \ldots, X_{k-1} = 0, X_k = 1]$, we have
$$P(Y_1 = k) = (1-p)^{k-1} p \quad \text{for } k = 1, 2, \ldots \tag{2}$$
We say that $Y_1 \sim$ Geometric($p$). Now let $W_m \equiv \min\{n : S_n = m\}$. We call $W_m$ the waiting time to the $m$-th success. Let $Y_m \equiv W_m - W_{m-1}$ for $m \ge 1$, with $W_0 \equiv 0$; we call the $Y_m$'s the interarrival times. Note that $[W_m = k] = [S_{k-1} = m - 1, X_k = 1]$. Hence
$$P(W_m = k) = \binom{k-1}{m-1} p^m (1-p)^{k-m} \quad \text{for } k = m, m+1, \ldots \tag{3}$$
We say that $W_m \sim$ Negative Binomial($m, p$).

Exercise 1.1 Show that $Y_1, Y_2, \ldots$ are i.i.d. Geometric($p$).

Since the number of successes in $n_1 + n_2$ trials is the number of successes in the first $n_1$ trials plus the number of successes in the next $n_2$ trials, it is clear that for independent $Z_i \sim$ Binomial($n_i, p$),
$$Z_1 + Z_2 \sim \text{Binomial}(n_1 + n_2, p). \tag{4}$$
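The binomial pmf (1) can be checked against simulation. The following Python sketch (not part of the original notes; the parameter values are arbitrary) draws many realizations of $S_n$ as a sum of Bernoulli($p$) indicators and compares the empirical frequencies to $\binom{n}{k} p^k (1-p)^{n-k}$.

```python
import math
import random

random.seed(0)

n, p, trials = 10, 0.3, 200_000

# Empirical distribution of S_n = number of successes in n Bernoulli(p) trials.
counts = [0] * (n + 1)
for _ in range(trials):
    s_n = sum(1 for _ in range(n) if random.random() < p)
    counts[s_n] += 1

def binom_pmf(k, n, p):
    # Equation (1): P(S_n = k) = C(n, k) p^k (1-p)^(n-k).
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# With 200,000 trials the empirical frequencies should sit well
# within 0.01 of the exact pmf for every k.
for k in range(n + 1):
    emp = counts[k] / trials
    assert abs(emp - binom_pmf(k, n, p)) < 0.01
```

The same loop, with `min` over the first success index instead of a sum, would estimate the geometric pmf (2).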
Likewise, for independent $Z_i \sim$ Negative Binomial($m_i, p$),
$$Z_1 + Z_2 \sim \text{Negative Binomial}(m_1 + m_2, p). \tag{5}$$

Urn Models

Suppose that an urn contains $N$ balls, of which $M$ bear the number 1 and $N - M$ bear the number 0. Thoroughly mix the balls in the urn. Draw one ball at random. Let $X_1$ denote the number on the ball. Then $X_1 \sim$ Bernoulli($p$) with $p = M/N$. Now replace the ball back in
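The additivity properties (4) and (5) are exact identities between pmfs, so they can be verified numerically by direct convolution: $P(Z_1 + Z_2 = k) = \sum_j P(Z_1 = j)\,P(Z_2 = k - j)$. A short Python check (not from the notes; parameters are arbitrary):

```python
import math

def binom_pmf(k, n, p):
    # Equation (1): P(S_n = k) = C(n, k) p^k (1-p)^(n-k).
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def negbin_pmf(k, m, p):
    # Equation (3): P(W_m = k) = C(k-1, m-1) p^m (1-p)^(k-m), for k >= m.
    return math.comb(k - 1, m - 1) * p**m * (1 - p)**(k - m)

p = 0.3

# (4): convolving Binomial(n1, p) with Binomial(n2, p), same p,
# reproduces Binomial(n1 + n2, p) term by term.
n1, n2 = 4, 6
for k in range(n1 + n2 + 1):
    conv = sum(binom_pmf(j, n1, p) * binom_pmf(k - j, n2, p)
               for j in range(max(0, k - n2), min(k, n1) + 1))
    assert abs(conv - binom_pmf(k, n1 + n2, p)) < 1e-12

# (5): same check for Negative Binomial; the support is infinite,
# so we only compare finitely many terms.
m1, m2 = 2, 3
for k in range(m1 + m2, 60):
    conv = sum(negbin_pmf(j, m1, p) * negbin_pmf(k - j, m2, p)
               for j in range(m1, k - m2 + 1))
    assert abs(conv - negbin_pmf(k, m1 + m2, p)) < 1e-12
```

Note that both checks rely on the two summands sharing the same success probability $p$; the identities fail if the $p$'s differ.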
This note was uploaded on 04/14/2010 for the course STATS 610 taught by Professor Moulib during the Fall '09 term at University of Michigan.
