Chapter 5: Discrete Probability Distributions

Discrete vs. Continuous Random Variables

Discrete
Definition: A discrete random variable has outcomes that typically take on whole-number values as a result of conducting an experiment.
Example: The number of people coming into a store in the first hour after it opens in the morning.

Continuous
Definition: A continuous random variable has outcomes that can take on any numerical value as a result of conducting an experiment.
Example: The length of time someone waits in line at the grocery store.

Rules for a Discrete Probability Distribution:
1. Each outcome in the distribution must be mutually exclusive of the other outcomes in the distribution.
2. Each probability P(x_i) must be between 0 and 1.
3. The probabilities of all outcomes must sum to 1.

The mean (or expected value) of a discrete probability distribution is simply the weighted average of the possible outcomes.

Formula for the Mean/Expected Value of a Discrete Probability Distribution:

µ = E(x) = Σ x_i · P(x_i)

Where:
µ = the mean (expected value) of the distribution
x_i = the i-th outcome
P(x_i) = the probability of the i-th outcome
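The rules and the expected-value formula can also be checked numerically. Below is a minimal Python sketch, not part of the original outline; the customer-count distribution in it is made up for illustration. It verifies the three rules for a discrete probability distribution and computes the mean as the weighted average of the outcomes.

def is_valid_distribution(dist, tol=1e-9):
    """Check rules 2 and 3: each P(x_i) is between 0 and 1, and the
    probabilities sum to 1 (within a small floating-point tolerance).
    Rule 1 (mutually exclusive outcomes) is enforced by using a dict,
    since each outcome can appear only once as a key."""
    probs = dist.values()
    each_in_range = all(0.0 <= p <= 1.0 for p in probs)
    sums_to_one = abs(sum(probs) - 1.0) <= tol
    return each_in_range and sums_to_one

def expected_value(dist):
    """Mean of the distribution: the weighted average sum of x_i * P(x_i)."""
    return sum(x * p for x, p in dist.items())

# Hypothetical example: number of customers in the first hour after opening.
customers = {0: 0.10, 1: 0.25, 2: 0.35, 3: 0.20, 4: 0.10}

assert is_valid_distribution(customers)
print(expected_value(customers))  # 0(0.10) + 1(0.25) + 2(0.35) + 3(0.20) + 4(0.10) = 1.95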