Probability Theory - A Mathematical Basis for Making Decisions under Risk and Uncertainty: Lecture III

Charles B. Moss

August 24, 2010

I. Introduction
   A. In the vernacular of the statistician, the unknown or unknowable event is called a random variable.
      1. The observed value of a random variable is then referred to as an observation or outcome of the random variable.
      2. If the outcome is known in advance, the process is not random but deterministic or certain.
      3. An event whose outcome is not known with certainty is called random or stochastic.
      4. Random variables with a finite number of outcomes are typically referred to as discrete random variables.
      5. The alternative to a discrete random variable is a continuous random variable.
   B. Intuitively, we define the probability of an event as the relative likelihood that the event will occur.
      1. In the coin-flip example, the standard reasoning is that two events of the discrete random variable are possible (heads or tails).
      2. If the coin is fair, we anticipate that the two events are equally likely.
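The relative-likelihood intuition in I.B can be illustrated with a short simulation: if a fair coin is flipped many times, the relative frequency of heads should settle near 1/2. This sketch is not part of the lecture; the function name and the choice of a fixed seed are illustrative assumptions.

```python
import random

def relative_frequency_of_heads(n_flips, seed=42):
    """Flip a fair coin n_flips times; return the fraction that land heads."""
    rng = random.Random(seed)  # fixed seed for reproducibility (an assumption)
    heads = sum(rng.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

# As the number of flips grows, the observed relative frequency
# of heads approaches the theoretical probability of 0.5.
for n in (10, 1_000, 100_000):
    print(n, relative_frequency_of_heads(n))
```

For a small number of flips the observed frequency can stray noticeably from 0.5; the convergence as the sample grows is exactly what the "relative likelihood" definition of probability appeals to.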
This note was uploaded on 07/15/2011 for the course AEB 6182 taught by Professor Weldon during the Fall '08 term at University of Florida.