Chapter 8  Discrete Probability Distributions

We begin by defining a random variable. A random variable is a function or rule that assigns a number to each outcome of an experiment. Instead of talking about a coin flip in terms of heads and tails, for example, we can think of it as 1 and 0, what we have called an indicator variable. This is a numerical event which can be called "the number of heads when flipping a coin." There are two types of random variables: a discrete random variable takes on a countable number of values (for example, the sum of a roll of two dice: 2, 3, ..., 12), while a continuous random variable takes on values that are uncountable, that is, values on the real line (for example, time: we can speak of 30.1 minutes, 30.10001 minutes, etc.). A useful analogy is "integers are discrete, while real numbers are continuous."

A probability distribution is a listing of all possible outcomes of an experiment and their corresponding probabilities. A discrete distribution is based on random variables which can assume only clearly separated values. A continuous distribution can assume an infinite number of values within a given range. All of this is not new - we have seen the histogram!

8.1  The discrete random variable

When dealing with univariate data, we have seen how the histogram together with its mean and standard deviation is a useful way to summarize data. In statistics, what we are in fact dealing with is a random variable, sometimes denoted r.v. A random variable is a function or rule that assigns a number to each outcome of an experiment. Random variables can be either discrete, i.e. taking on only whole numbers like 0, 1, 2, ..., or continuous, i.e. taking on any value on the real line. A useful analogy is "integers are discrete, while real numbers are continuous."

The outcomes of a random variable can be described by a probability distribution, which is a listing of all possible outcomes of an experiment and their corresponding probabilities. A discrete distribution is based on random variables which can assume only clearly separated values, while a continuous distribution is one that can assume an infinite number of values within a given range.

Let us look at an example. Consider an experiment in which a coin is tossed three times. Let X be a r.v. representing the number of heads in 3 tosses. The possible outcomes are:

Table 8.1: Outcomes of 3 tosses of a coin

  Outcome    No. of heads
  T T T      0
  T T H      1
  T H T      1
  T H H      2
  H T T      1
  H H T      2
  H T H      2
  H H H      3

Note we have 8 equally likely outcomes, from which we can easily construct the probability distribution of X, the number of heads in 3 tosses, as:

Table 8.2: Probability distribution of 3 tosses of a coin

  X    P(X)
  0    1/8 or .125
  1    3/8 or .375
  2    3/8 or .375
  3    1/8 or .375

The probability distribution, sometimes also referred to as the probability function, is in fact a histogram which looks as follows:
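As an illustration (a minimal sketch added here, not part of the original notes), the following Python snippet reconstructs Table 8.2 by enumerating the eight equally likely outcomes of three tosses and prints a rough text histogram of P(X):

```python
from itertools import product
from fractions import Fraction
from collections import Counter

# Enumerate the 8 equally likely outcomes of tossing a coin 3 times.
outcomes = list(product("HT", repeat=3))

# X = number of heads in each outcome; count how many outcomes give each value of X.
counts = Counter(outcome.count("H") for outcome in outcomes)

# P(X = x) = (number of outcomes with x heads) / 8, as in Table 8.2.
print("X  P(X)")
for x in sorted(counts):
    p = Fraction(counts[x], len(outcomes))
    # The '#' bars sketch the histogram of the distribution.
    print(f"{x}  {p} or {float(p):.3f}  {'#' * counts[x]}")
```

Running this reproduces the four probabilities 1/8, 3/8, 3/8, 1/8 and shows the symmetric, mound-shaped pattern of the distribution.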