4 Special Probability Distributions

4.1 Introduction
4.2 The Bernoulli Trial and Bernoulli Distribution
4.3 Binomial Distribution
4.4 Geometric Distribution
4.5 Pascal (or Negative Binomial) Distribution
4.6 Hypergeometric Distribution
4.7 Poisson Distribution
4.8 Exponential Distribution
4.9 Erlang Distribution
4.10 Uniform Distribution
4.11 Normal Distribution
4.12 The Hazard Function
4.13 Chapter Summary
4.14 Problems

4.1 Introduction

Chapters 2 and 3 deal with general properties of random variables. Random variables with special probability distributions are encountered in different fields of science and engineering. The goal of this chapter is to describe some of these distributions, including their expected values and variances. These include the Bernoulli distribution, binomial distribution, geometric distribution, Pascal distribution, hypergeometric distribution, Poisson distribution, exponential distribution, Erlang distribution, uniform distribution, and normal distribution.

4.2 The Bernoulli Trial and Bernoulli Distribution

A Bernoulli trial is an experiment that results in one of two outcomes: success or failure. One example of a Bernoulli trial is the coin-tossing experiment, which results in heads or tails. In a Bernoulli trial we define the probability of success and the probability of failure as follows:

P[\text{success}] = p, \quad 0 \le p \le 1
P[\text{failure}] = 1 - p

Let us associate the events of the Bernoulli trial with a random variable X such that when the outcome of the trial is a success, we define X = 1, and when the outcome is a failure, we define X = 0. The random variable X is called a Bernoulli random variable, and its PMF is given by

p_X(x) = \begin{cases} 1 - p, & x = 0 \\ p, & x = 1 \end{cases}

An alternative way to define the PMF of X is as follows:

p_X(x) = p^x (1 - p)^{1 - x}, \quad x = 0, 1

The PMF of X can be plotted as shown in Figure 4.1.
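As a minimal sketch of the two equivalent PMF definitions above (the parameter value p = 0.3 is chosen only for illustration, not taken from the text), the piecewise form and the closed form can be checked against each other in Python:

```python
# Sketch of the Bernoulli PMF in its two equivalent forms.
# p = 0.3 is an illustrative choice of the success probability, 0 <= p <= 1.
p = 0.3

def bernoulli_pmf(x, p):
    """Piecewise form: P(X = 0) = 1 - p, P(X = 1) = p."""
    return 1 - p if x == 0 else p

def bernoulli_pmf_alt(x, p):
    """Closed form: p^x * (1 - p)^(1 - x) for x in {0, 1}."""
    return p ** x * (1 - p) ** (1 - x)

# The two definitions agree on both support points.
for x in (0, 1):
    assert bernoulli_pmf(x, p) == bernoulli_pmf_alt(x, p)
```

The closed form is convenient because it collapses the case analysis into a single expression, which is what makes the binomial PMF of Section 4.3 easy to derive.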
The CDF is given by

F_X(x) = \begin{cases} 0, & x < 0 \\ 1 - p, & 0 \le x < 1 \\ 1, & x \ge 1 \end{cases}

The expected value of X is given by

E[X] = 0(1 - p) + 1(p) = p

[Figure 4.1: PMF of the Bernoulli Random Variable]

Similarly, the second moment of X is given by

E\left[X^2\right] = 0^2(1 - p) + 1^2(p) = p

Thus, the variance of X is given by

\sigma_X^2 = E\left[X^2\right] - \{E[X]\}^2 = p - p^2 = p(1 - p)

4.3 Binomial Distribution

Suppose we conduct n independent Bernoulli trials, and we represent the number of successes in those n trials by the random variable X(n). Then X(n) is defined as a binomial random variable with parameters (n, p). The PMF of a random variable X(n) with parameters (n, p) is given by

p_{X(n)}(x) = \binom{n}{x} p^x (1 - p)^{n - x}, \quad x = 0, 1, 2, \ldots, n

The binomial coefficient, \binom{n}{x}, represents the number of ways of arranging x successes and n - x failures. The shape of the PMF of X...
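The binomial PMF above can be sketched directly with Python's standard library (the values n = 5 and p = 0.3 are illustrative assumptions, not from the text). The sketch also checks two facts implied by the derivation: the probabilities over x = 0, ..., n sum to 1, and with n = 1 the binomial PMF reduces to the Bernoulli PMF of Section 4.2:

```python
# Sketch of the binomial PMF; n = 5, p = 0.3 are illustrative parameters.
from math import comb, isclose

def binomial_pmf(x, n, p):
    """P(X(n) = x) = C(n, x) * p^x * (1 - p)^(n - x)."""
    return comb(n, x) * p ** x * (1 - p) ** (n - x)

n, p = 5, 0.3
probs = [binomial_pmf(x, n, p) for x in range(n + 1)]

# A valid PMF sums to 1 over its support.
assert isclose(sum(probs), 1.0)

# With a single trial (n = 1), the binomial PMF is the Bernoulli PMF.
assert isclose(binomial_pmf(0, 1, p), 1 - p)
assert isclose(binomial_pmf(1, 1, p), p)
```

`math.comb(n, x)` computes the binomial coefficient exactly, so no factorial arithmetic or special-casing of the endpoints x = 0 and x = n is needed.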
This note was uploaded on 01/05/2010 for the course STAT 350 taught by Professor Carlton during the Fall '07 term at Cal Poly.