Probability Review
CS 221 Section 3
Olga Russakovsky
October 12, 2009

1 Random variables

Consider running a probabilistic experiment, e.g., tossing a coin. Let Ω be the set of all possible outcomes of this experiment, called the sample space. In this case,

    Ω = {Heads, Tails}

If we were to toss a coin 3 times, then

    Ω = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}

A random variable is a function that maps outcomes of a probabilistic experiment to a real number. More formally, a random variable X is a function X : Ω → R. For example, we can define a random variable

    X = the number of heads in two tosses of a coin    (1)

If a random variable X takes on only a finite number of values, i.e., X ∈ {0, 1, ..., n}, then it is called a discrete random variable. Otherwise, it is called a continuous random variable. The example above in (1) is a discrete random variable. The simplest example of a continuous random variable is a random number generator that can return any real number between 0 and 1. In CS221 we will be mostly dealing just with discrete variables.

Many thanks to Quoc Le, Arian Maleki and Tom Do. For more details, please refer to the very thorough probability review handout found at http://cs229.stanford.edu/section/cs229-prob.pdf

2 Probability Distributions

2.1 Discrete random variables

For a discrete random variable, a probability distribution is a list of probabilities associated with each of its possible values. For example, in the example above, if we have a fair coin, then

    P(X = 0) = P(TT outcome) = 1/4
    P(X = 1) = P(HT or TH outcome) = 1/2
    P(X = 2) = P(HH outcome) = 1/4

For discrete variables, this is called the probability mass function, or PMF. We denote

    P(X = x) = P_X(x) = p(x)    (2)

The requirements imposed on the probability mass function are:

    - p(x) ≥ 0 for all x
    - Σ_x p(x) = 1

Additionally, define an event A to be a subset of the sample space, i.e., A ⊆ Ω.
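The two-toss PMF above can be checked by enumerating the sample space directly. This is a minimal sketch (not part of the original handout); the names `omega`, `X`, and `pmf` are illustrative choices:

```python
from fractions import Fraction
from itertools import product

# Sample space for two tosses of a fair coin: HH, HT, TH, TT
omega = ["".join(t) for t in product("HT", repeat=2)]

# Random variable X: number of heads in an outcome
def X(outcome):
    return outcome.count("H")

# Build the PMF: each of the 4 equally likely outcomes contributes 1/4
pmf = {}
for outcome in omega:
    pmf[X(outcome)] = pmf.get(X(outcome), Fraction(0)) + Fraction(1, 4)

# The PMF requirements: p(x) >= 0 for all x, and the values sum to 1
assert all(p >= 0 for p in pmf.values())
assert sum(pmf.values()) == 1
```

Running this recovers P(X = 0) = 1/4, P(X = 1) = 1/2, P(X = 2) = 1/4, matching the table above.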
For example, A might be the event that at least one toss turns up heads, i.e., X ≥ 1. Then

    p_X(A) = Σ_{x ∈ A} p_X(x)    (3)

So in the case where A is the event of at least one heads,

    p_X(A) = P(X = 1) + P(X = 2) = 1/2 + 1/4 = 3/4

Two common examples of discrete random variables are:

- X ~ Bernoulli(p) with 0 ≤ p ≤ 1 corresponds to taking a coin which has probability p of coming up heads, and flipping it once. X is 1 if heads comes up, 0 otherwise.

      p(x) = { p        if x = 1
             { 1 - p    if x = 0    (4)

- X ~ Binomial(n, p) with 0 ≤ p ≤ 1 corresponds to taking a coin which has probability p of coming up heads, and flipping it n times. X is the number of heads.

      p(x) = (n choose x) p^x (1 - p)^(n - x)    (5)

The example discussed above is Binomial(2, 1/2).

2.2 Continuous random variables

In the case of a continuous random variable, the probability distribution is called the probability density function, or PDF, and is often denoted f(x). Here, for example, Ω can be the set of all ...
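The Bernoulli and Binomial PMFs in (4) and (5) translate directly into code. A brief sketch (the function names `bernoulli_pmf` and `binomial_pmf` are illustrative, not from the handout):

```python
from math import comb

def bernoulli_pmf(x, p):
    # Equation (4): p(1) = p, p(0) = 1 - p
    return p if x == 1 else 1 - p

def binomial_pmf(x, n, p):
    # Equation (5): p(x) = C(n, x) * p^x * (1 - p)^(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Binomial(2, 1/2) recovers the two-toss example: 1/4, 1/2, 1/4
print([binomial_pmf(x, 2, 0.5) for x in range(3)])  # [0.25, 0.5, 0.25]
```

Note that Bernoulli(p) is just Binomial(1, p), and that the Binomial probabilities over x = 0, ..., n sum to 1 by the binomial theorem.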