Chapter 5: Discrete Probability Distributions

Discrete vs. Continuous Random Variables

Discrete
Definition: Discrete random variables have outcomes that typically take on whole numbers as a result of conducting an experiment.
Example: the number of people coming into a store in the hour after it first opens in the morning.

Continuous
Definition: Continuous random variables have outcomes that can take on any numerical value as a result of conducting an experiment.
Example: the length of time someone waits in line at the grocery store.

Rules for a Discrete Probability Distribution:
1. Each outcome in the distribution must be mutually exclusive with the other outcomes in the distribution.
2. Each probability P(x) must be between 0 and 1.
3. The probabilities P(x) for all outcomes must sum to 1.

The mean (or expected value) of a discrete probability distribution is simply the weighted average of the possible outcomes.

Formula for the Mean/Expected Value of a Discrete Probability Distribution:

µ = E(x) = Σ [xi · P(xi)]

Where:
µ = the mean (expected value)
xi = the i-th outcome
P(xi) = the probability of the i-th outcome
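The rules and the mean formula above can be sketched in a few lines of code. This is a minimal illustration, not part of the original notes; the function name and the example outcomes/probabilities are made up for demonstration:

```python
def expected_value(outcomes, probabilities):
    """Weighted average of the outcomes: mu = sum of xi * P(xi)."""
    # Rule 2: each P(x) must be between 0 and 1
    if any(p < 0 or p > 1 for p in probabilities):
        raise ValueError("Each probability must be between 0 and 1")
    # Rule 3: the probabilities must sum to 1 (small tolerance for floats)
    if abs(sum(probabilities) - 1.0) > 1e-9:
        raise ValueError("Probabilities must sum to 1")
    return sum(x * p for x, p in zip(outcomes, probabilities))

# Hypothetical distribution: number of customers in the first hour
xs = [0, 1, 2, 3]
ps = [0.1, 0.3, 0.4, 0.2]
mu = expected_value(xs, ps)  # 0(0.1) + 1(0.3) + 2(0.4) + 3(0.2), which is about 1.7
```

Each outcome appears once in the list (mutually exclusive, rule 1), and the function enforces rules 2 and 3 before computing the weighted average.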