Example 7. A poker hand: 5 cards dealt at random from a deck of 52 cards (4 suits, 13 ranks each).
P(exactly 2 aces in a 5-card poker hand) = ?
Solution 1. In class, we said we can think of the sample space S as all possible combinations of 52 cards taken 5 at a time.
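As a quick check, the counting argument can be computed directly (a sketch using Python's `math.comb`):

```python
from math import comb

# Exactly 2 aces: choose 2 of the 4 aces and 3 of the other 48 cards,
# out of all C(52, 5) equally likely 5-card hands.
p = comb(4, 2) * comb(48, 3) / comb(52, 5)
print(p)  # ≈ 0.0399
```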
Binomial Experiments: Model assumptions
Definition 1 (Definition 3.6). A binomial experiment possesses the following properties:
1. The experiment consists of n trials, where n is fixed in advance.
2. Each trial results in one of two outcomes, which are called success (S) and failure (F).
3. The probability of success, p, is the same on every trial.
4. The trials are independent.
5. The random variable of interest is Y, the number of successes observed in the n trials.
Consider an equilateral triangle inscribed in a circle. Suppose a chord of the
circle is chosen at random. What is the probability that the chord is longer
than a side of the triangle?
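This is Bertrand's paradox: the answer depends on what "chosen at random" means. A minimal simulation sketch of one interpretation, picking the chord's two endpoints uniformly on the circle (which yields probability 1/3):

```python
import math
import random

# Endpoints-uniform model: pick two angles uniformly on [0, 2*pi).
# The inscribed equilateral triangle has side length sqrt(3) * R.
random.seed(0)
R, trials = 1.0, 100_000
side = math.sqrt(3) * R
longer = 0
for _ in range(trials):
    a = random.uniform(0, 2 * math.pi)
    b = random.uniform(0, 2 * math.pi)
    chord = 2 * R * abs(math.sin((a - b) / 2))
    longer += chord > side
print(longer / trials)  # close to 1/3 under this model
```

Other natural sampling models (random midpoint in the disk, random distance from the center) give 1/4 and 1/2 instead, which is the point of the paradox.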
A real number be
Will it rain?
On an average day in October, it rains in Corvallis with a chance of 0.4.
Weather forecasting is not always accurate:
When it will rain the next day, the forecast predicts rain with probability
When there is
The expected value of a function of a random variable
Let Y be a discrete random variable with p.m.f. p(y) and let g(Y ) be a real-valued
function of Y . We define a new random variable by Z := g(Y ).
The expected value of Z = g(Y) is given by
E[Z] = E[g(Y)] = Σ_y g(y) p(y), where the sum runs over all y with p(y) > 0.
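A minimal numeric sketch, with a hypothetical p.m.f. on {0, 1, 2} and g(y) = y²:

```python
# Hypothetical p.m.f. chosen for illustration; E[g(Y)] = sum of g(y) * p(y).
pmf = {0: 0.25, 1: 0.5, 2: 0.25}
e_g = sum(y**2 * p for y, p in pmf.items())
print(e_g)  # 0*0.25 + 1*0.5 + 4*0.25 = 1.5
```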
Example 1. An urn contains N marbles, r red and N − r black. Draw a random sample without replacement of size n. Let Y be the number of red marbles in the sample. Then Y has a hypergeometric distribution with parameters N, r, and n.
Example 1. A thief steals. Each time he steals, he gets caught with a chance of 20%.
The first 3 times he gets caught, he will be excused. Beyond that, he will be sent to
prison for life. Assuming whether he gets caught is independent across thefts, how m
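The question above is cut off; a natural reading asks how many thefts he can expect to commit before going to prison, i.e. the number of trials up to and including the 4th catch — negative binomial with r = 4 and p = 0.2, mean r/p = 20. A simulation sketch under that assumed reading:

```python
import random

random.seed(1)

def thefts_until_prison(p=0.2, excused_catches=3):
    # Count thefts until the (excused_catches + 1)-th catch, which means prison.
    caught = thefts = 0
    while caught <= excused_catches:
        thefts += 1
        if random.random() < p:
            caught += 1
    return thefts

n = 50_000
mean = sum(thefts_until_prison() for _ in range(n)) / n
print(mean)  # close to 4 / 0.2 = 20
```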
Recap: Laws of probability
The Additive Law (The Inclusion-Exclusion Principle)
P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
The Complement Law
P(Aᶜ) = 1 − P(A).
The Difference Law
If B ⊂ A, then
P(A − B) = P(A) − P(B).
The Multiplicative Law (The Chain Rule)
P(A ∩ B) = P(A) P(B|A).
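These laws can be sanity-checked on a small finite example (a single fair die roll, with events chosen here just for illustration); `fractions.Fraction` keeps the arithmetic exact:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}        # one fair die roll
A = {2, 4, 6}                 # "even"
B = {4, 5, 6}                 # "at least 4"
P = lambda E: Fraction(len(E), len(S))

# Additive law: P(A ∪ B) = P(A) + P(B) − P(A ∩ B)
assert P(A | B) == P(A) + P(B) - P(A & B)
# Complement law: P(Aᶜ) = 1 − P(A)
assert P(S - A) == 1 - P(A)
print(P(A | B))  # 2/3
```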
Definition 1 (Definition 4.6). If a < b, a random variable Y is said to have a continuous uniform probability distribution on the interval (a, b), denoted by Y ∼ Unif(a, b), if and only if the density function of Y is
f(y) = 1/(b − a) for a ≤ y ≤ b, and f(y) = 0 otherwise.
Example 1. (β is a scaling parameter) Let Y ∼ Exp(β). Define X = Y/β. Then X ∼ Exp(1).
Proof. For any x > 0, P(X ≤ x) = P(Y/β ≤ x) = P(Y ≤ βx) = 1 − e^(−βx/β) = 1 − e^(−x).
This agrees with the c.d.f. of Exp(1).
Remark 1. If Y ∼ Exp(β) and c > 0 is a constant, then X := cY ∼ Exp(cβ).
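A quick simulation sketch of the scaling result, using Python's `random.expovariate` (whose argument is the rate 1/β):

```python
import random

random.seed(2)
beta = 5.0
# Y ~ Exp(beta) has mean beta; X = Y / beta should look like Exp(1).
xs = [random.expovariate(1 / beta) / beta for _ in range(100_000)]
mean_x = sum(xs) / len(xs)
print(mean_x)  # close to 1, the mean of Exp(1)
```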
Read textbook 5.1 – 5.3.
Multivariate probability distributions
A multivariate random variable is a collection of univariate random variables, all defined on the same sample space.
A univariate random variable assigns one number to each sample point (measures one characteristic of the outcome).
(Read textbook 3.9 and 4.9)
Moment generating functions
Definition 1. (Moments)
1. The k-th moment of a random variable Y (taken about the origin) is defined to be E(Y^k) and is denoted by μ′_k.
2. The k-th central moment of a random variable Y (or the k-th moment about its mean) is defined to be E[(Y − μ)^k] and is denoted by μ_k.
Bilinearity of Covariance (Recap)
Theorem 1. Let Y1, Y2, . . . , Yn and X1, . . . , Xm be r.v.s; then
1. Let X and Y be random variables (not necessarily independent), and a, b, c and d
be constants. Then
cov(aX, Y ) = a cov(X, Y ),
cov(X + c, Y ) = cov(X, Y ),
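A numeric sanity check of these two identities, using sample covariances on simulated (hypothetical) data:

```python
import random

random.seed(3)
n, a, c = 100_000, 3.0, 7.0
X = [random.gauss(0, 1) for _ in range(n)]
Y = [x + random.gauss(0, 1) for x in X]   # correlated with X

def cov(U, V):
    # Sample covariance (population form, dividing by n).
    mu, mv = sum(U) / n, sum(V) / n
    return sum((u - mu) * (v - mv) for u, v in zip(U, V)) / n

# cov(aX, Y) = a cov(X, Y);  cov(X + c, Y) = cov(X, Y)
print(cov([a * x for x in X], Y), a * cov(X, Y))
print(cov([x + c for x in X], Y), cov(X, Y))
```

Each pair printed agrees up to floating-point round-off, as the identities predict.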
The product rule of counting
The product rule (mn rule, Page 41): if the experiment takes k steps and the number of possible outcomes in step i is n_i, then the total number of outcomes is n_1 n_2 · · · n_k.
Example 1. Toss a coin 3 times; the total number of outcomes is 2 × 2 × 2 = 8.
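The coin example can be enumerated with `itertools.product`, which builds the k-step outcomes directly:

```python
from itertools import product

# 3 steps, 2 outcomes each: 2 * 2 * 2 = 8 total outcomes.
outcomes = list(product("HT", repeat=3))
print(len(outcomes), outcomes[0])  # 8 ('H', 'H', 'H')
```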
Read textbook 2.10.
The Law of Total Probability
The law of total probability
P(A) = P(A ∩ B) + P(A ∩ Bᶜ).
In terms of conditional probabilities,
P(A) = P(A|B) P(B) + P(A|Bᶜ) P(Bᶜ).
Generally, suppose the sample space is partitioned into B1, B2, . . . , Bk. Then
P(A) = P(A|B1) P(B1) + · · · + P(A|Bk) P(Bk).
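A numeric sketch tying this to the rain-forecast setup above, with the conditional accuracies made up for illustration (the original values are cut off):

```python
# B = "it rains", with P(B) = 0.4 (the Corvallis example).
# A = "forecast says rain"; the accuracies below are hypothetical,
# since the original numbers are truncated in the notes.
p_B = 0.4
p_A_given_B, p_A_given_notB = 0.8, 0.1
p_A = p_A_given_B * p_B + p_A_given_notB * (1 - p_B)
print(p_A)  # ≈ 0.8*0.4 + 0.1*0.6 = 0.38
```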
Conditional probability and independence
(Read textbook 2.7)
Conditional probability of an event A given that an event B (P(B ) > 0) has occurred:
P(A|B) = P(A ∩ B) / P(B).
Remark 1. When computing a conditional probability, we view B as the new, reduced sample space.
Laws (rules) of probability
(Read textbook 2.8, 2.9)
The Additive Law
P(A ∪ B) = P(A) + P(B) − P(A ∩ B).
For three events,
P(A ∪ B ∪ C) = P(A) + P(B) + P(C) − P(A ∩ B) − P(A ∩ C) − P(B ∩ C) + P(A ∩ B ∩ C).
Only when A and B are mutually exclusive (disjoint) do we have
P(A ∪ B) = P(A) + P(B).
Read textbook 4.1 - 4.4.
It is time to review your calculus!
Continuous random variables
Random experiments can have continuous outcomes:
X: the body height of a randomly selected person in a population.
Y : the waiting time at a bus stop.
Z : the time to f
Random variables and their distributions
Read textbook 2.11, 3.1 - 3.3
A random variable assigns a number to each sample point in the sample space (assigns a
number to each outcome of a random experiment). Formally, it is a function from the sample space S to the real numbers.
Definition 1 (Gamma function). The gamma function is defined, for α > 0, as
Γ(α) = ∫₀^∞ y^(α−1) e^(−y) dy.
Properties of the gamma function:
1. Γ(1) = 1.
2. Γ(α) = (α − 1) Γ(α − 1).
3. Γ(n) = (n − 1)!, for integers n = 1, 2, 3, . . . .
4. Γ(1/2) = √π.
For property 2, use integration by parts.
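The four properties can be checked numerically with `math.gamma` (a quick sketch):

```python
import math

assert math.gamma(1) == 1                                     # property 1
assert math.isclose(math.gamma(3.5), 2.5 * math.gamma(2.5))   # property 2
assert math.isclose(math.gamma(5), math.factorial(4))         # property 3
assert math.isclose(math.gamma(0.5), math.sqrt(math.pi))      # property 4
print("all four properties hold numerically")
```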
Discrete probability models
A model is a simplified representation of some aspect of reality.
Why simple probability models?
They capture common features of commonly seen random experiments.
They summarize probabilities defined on a complex sample space using just a few parameters.
Read textbook 4.5.
We can define a continuous distribution by specifying its probability density function (p.d.f.).
Definition 1. A random variable Y is said to have a normal distribution with parameters μ and σ², where −∞ < μ < ∞ and σ > 0, if and only if the density function of Y is
f(y) = (1/(σ√(2π))) e^(−(y−μ)²/(2σ²)),  −∞ < y < ∞.
Definition 1. A random variable Y is said to have a geometric distribution with parameter p (0 < p ≤ 1) if and only if
p(y) = (1 − p)^(y−1) p,  y = 1, 2, 3, . . .
[Figure: p.m.f. of the geometric distribution with p = 0.3]
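As a quick check of the p.m.f. for the plotted case p = 0.3, the probabilities decay geometrically and sum to 1:

```python
p = 0.3
pmf = lambda y: (1 - p) ** (y - 1) * p   # geometric p.m.f., y = 1, 2, 3, ...
probs = [pmf(y) for y in range(1, 51)]
print(probs[0], sum(probs))  # p(1) = 0.3; the partial sum is already ≈ 1
```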
Two events A and B are independent if
P(A ∩ B) = P(A) P(B).
When P(B) > 0, A and B are independent iff P(A|B) = P(A).
When P(A) > 0, A and B are independent iff P(B|A) = P(B).
A and the empty set ∅ are independent.
A and the sample space S are independent.
Read textbook 6.1 – 6.5.
Functions of random variables
Given the distribution of Y (in terms of its p.m.f., p.d.f., or c.d.f.), how do you determine the distribution of a function g(Y) of Y?
Given the joint distribution of Y1, . . . , Yn (in terms of their joint p.m.f. or p.d.f.), how do you determine the distribution of a function g(Y1, . . . , Yn)?
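One standard approach is the distribution-function (c.d.f.) method. A simulation sketch for a simple hypothetical case, Y ∼ Unif(0, 1) and g(Y) = Y², where the method gives P(Y² ≤ x) = P(Y ≤ √x) = √x:

```python
import random

random.seed(4)
ys = [random.random() for _ in range(100_000)]   # Y ~ Unif(0, 1)
x = 0.25
# Empirical c.d.f. of Y**2 at x, to compare with sqrt(x).
empirical = sum(y * y <= x for y in ys) / len(ys)
print(empirical)  # close to sqrt(0.25) = 0.5
```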