Lecture 10 - Probability Theory II (Setting Expectations for Computer Scientists)

Lecture 10 (September 29, 2011)
Probability Theory II
(Setting Expectations for Computer Scientists)

E[X+Y] = E[X] + E[Y]

Some useful sample spaces…

1) A fair coin.
Sample space S = {H, T}; Pr(H) = 1/2, Pr(T) = 1/2.

2) A "bias-p" coin.
Sample space S = {H, T}; Pr(H) = p, Pr(T) = 1 - p.

3) Two independent bias-p coin tosses.
Sample space S = {HH, HT, TH, TT};
Pr(HH) = p^2, Pr(HT) = p(1-p), Pr(TH) = (1-p)p, Pr(TT) = (1-p)^2.

4) n bias-p coins.
Sample space S = {H, T}^n. If outcome x in S has k heads and n-k tails, then
Pr(x) = p^k (1-p)^(n-k).
For the event E = {x in S | x has exactly k heads},
Pr(E) = Σ_{x in E} Pr(x) = C(n,k) p^k (1-p)^(n-k).
This is the "Binomial Distribution" B(n,p). (A numeric sanity check of this formula appears after the World Series setup below.)

The World Series.
Team A is better than team B: the odds of A winning are 6:5, i.e., in any game A wins with probability 6/11. What is the chance that A will beat B in the best-of-7 World Series?
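As a quick check of the binomial formula above, here is a minimal Python sketch using only the standard library (the name binomial_pmf is ours, not from the lecture); it confirms that the n+1 possible head-count probabilities sum to 1.

```python
from math import comb

def binomial_pmf(n: int, k: int, p: float) -> float:
    """Pr(exactly k heads in n tosses of a bias-p coin): C(n,k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Sanity check: the probabilities of the n+1 possible head counts sum to 1.
n, p = 7, 0.3
total = sum(binomial_pmf(n, k, p) for k in range(n + 1))
assert abs(total - 1.0) < 1e-12
print(f"B({n}, {p}) pmf sums to {total}")
```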
Team A beats B with probability p = 6/11 in each game.
(Implicit assumption: this is true for each game, independent of the past.)

Sample space S = {W, L}^7; Pr(x) = p^k (1-p)^(7-k) if there are k W's in x.

We want the event E = "team A wins at least 4 games":
E = {x in S | x has at least 4 W's}.
Pr(E) = Σ_{x in E} Pr(x) = Σ_{k=4}^{7} C(7,k) p^k (1-p)^(7-k) = 0.5986…
(This value is verified numerically in the first sketch below.)

Question: Why is it permissible to assume that the two teams play a full seven-game series, even if one team wins four games before seven have been played? (Hint: once one team reaches four wins, the outcomes of the remaining fictitious games cannot change which team won at least four.)

Binomials and Gaussians…

Given a fair coin (p = 1/2),
Pr(see exactly k heads in n flips) = C(n,k) p^k (1-p)^(n-k) = C(n,k) 2^(-n).
As n → ∞, the plot of Pr(k heads), viewed as a function of k/n, tends to the "bell curve", i.e., the Gaussian/normal distribution.

Infinite sample spaces…

A bias-p coin is tossed until the first time that a head turns up.
Sample space S = {H, TH, TTH, TTTH, …} (shorthand: S = {1, 2, 3, 4, …}).

The "Geometric" Distribution: Pr_G(k) = (1-p)^(k-1) p.

(Sanity check, confirmed in the second sketch below:
Σ_{k≥1} Pr_G(k) = Σ_{k≥1} (1-p)^(k-1) p
= p · (1 + (1-p) + (1-p)^2 + …)
= p · 1/(1 - (1-p)) = 1.)
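To double-check the 0.5986… figure, here is a small sketch (variable names are ours) that evaluates Σ_{k=4}^{7} C(7,k) p^k (1-p)^(7-k) with p = 6/11, using exact fractions to avoid rounding error:

```python
from fractions import Fraction
from math import comb

p = Fraction(6, 11)  # team A wins any single game with probability 6/11

# Pr(A wins at least 4 of the 7 independent games)
pr_series = sum(comb(7, k) * p**k * (1 - p)**(7 - k) for k in range(4, 8))
print(pr_series)         # 11665296/19487171
print(float(pr_series))  # 0.5986..., matching the value above
```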
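The geometric "sums to 1" sanity check can likewise be confirmed numerically. A sketch (names ours) that truncates the infinite sum at a cutoff where the tail is negligible:

```python
def geometric_pmf(k: int, p: float) -> float:
    """Pr(first head on toss k of a bias-p coin): (1-p)^(k-1) * p."""
    return (1 - p)**(k - 1) * p

p = 0.25
# Truncate the infinite sum; for p = 0.25 the tail beyond k = 500 is negligible.
total = sum(geometric_pmf(k, p) for k in range(1, 501))
print(total)  # ~1.0, as the sanity check predicts
```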
A bias-p coin is tossed until the first time that a head turns up.
Sample space S = {1, 2, 3, 4, …} — the Geometric Distribution.

E = "first heads at an even-numbered flip" = {2, 4, 6, …} "= {TH, TTTH, TTTTTH, …}".
Pr(E) = Σ_{x in E} Pr(x) = Σ_{k even} (1-p)^(k-1) p
= p(1-p) · (1 + (1-p)^2 + (1-p)^4 + …)
= p(1-p) · 1/(1 - (1-p)^2)
= (1-p)/(2-p).

Memorylessness.
Suppose the bias-p coin has been tossed n times and no head has shown up yet; call this event E_n. Then Pr[E_n] = (1-p)^n.
What is the distribution D of the number of further tosses t before we see heads?
Pr_D(t) = Pr[we see heads for the first time on the (n+t)'th throw | E_n]
= Pr_G(n+t) / Pr[E_n]
= (1-p)^(n+t-1) p / (1-p)^n
= p (1-p)^(t-1).
Thus D is the same geometric distribution! (Both facts are checked numerically below.)

Expectations E[·].
"Expectation = (weighted) average."
What is the average height of a 251 student?
Sample space S = all 251 students; (implicit) probability distribution = uniform.
average = (sum of heights in S)/|S|
= (Σ_{t in S} Height(t))/|S|
= Σ_{t in S} Pr(t) · Height(t).

Expectation.
Given a probability distribution D = (S, Pr) and any function f: S → reals,
E[f] = Σ_{t in S} f(t) Pr(t).
A function from the sample space to the reals is called a "random variable".

Random Variable.
Let S be the sample space in a probability distribution. A Random Variable is a function from S to the reals.
Examples:
X = value of the white die in a two-dice roll: X(3,4) = 3, X(1,6) = 1.
Y = sum of the values of the two dice: Y(3,4) = 7, Y(1,6) = 7.
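Both geometric facts above — Pr(first head on an even flip) = (1-p)/(2-p), and memorylessness — can be checked numerically. A sketch under the same truncation idea (names ours):

```python
def geometric_pmf(k: int, p: float) -> float:
    """Pr(first head on toss k of a bias-p coin): (1-p)^(k-1) * p."""
    return (1 - p)**(k - 1) * p

p = 0.4

# Pr(first head on an even-numbered flip) should equal (1-p)/(2-p).
pr_even = sum(geometric_pmf(k, p) for k in range(2, 1001, 2))
print(pr_even, (1 - p) / (2 - p))  # both ~0.375

# Memorylessness: Pr_D(t) = Pr_G(n+t) / Pr[E_n] should equal Pr_G(t).
n = 5
pr_En = (1 - p)**n  # probability of no head in the first n tosses
for t in range(1, 6):
    conditional = geometric_pmf(n + t, p) / pr_En
    assert abs(conditional - geometric_pmf(t, p)) < 1e-12
print("conditional distribution matches Geometric(p): memoryless")
```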
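To tie the random-variable examples back to the identity E[X+Y] = E[X] + E[Y] from the title slide, here is a minimal sketch (names ours) computing expectations over the two-dice sample space:

```python
from itertools import product

# Sample space: all 36 equally likely (white die, black die) outcomes.
S = list(product(range(1, 7), repeat=2))
pr = 1 / len(S)  # uniform distribution

def expectation(f):
    """E[f] = sum over the sample space of f(t) * Pr(t)."""
    return sum(f(t) * pr for t in S)

X = lambda t: t[0]          # X = value of the white die
Y = lambda t: t[0] + t[1]   # Y = sum of the two dice

print(expectation(X))                      # 3.5
print(expectation(Y))                      # 7.0
print(expectation(lambda t: X(t) + Y(t)))  # 10.5 = E[X] + E[Y]
```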
