Sanity check with respect to expected value. Sometimes it is the case that there are upper
and lower bounds, say M and m, so that m ≤ X ≤ M all the time. Then, when we compute
the expected value of X, it is enough to integrate from m to M:

E(X) = ∫_m^M x fX(x) dx,

and the answer must satisfy m ≤ E(X) ≤ M; if it does not, something went wrong.
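As a quick numerical check, here is a sketch assuming X is uniform on a made-up interval [2, 5]; the integral lands at the midpoint, safely between the bounds.

```python
# Sanity check: for a density supported on [m, M], E(X) = integral of x f(x)
# over [m, M], and the answer must land between m and M.
# Hypothetical example: X uniform on [2, 5].
m, M = 2.0, 5.0

def f(x):
    # Uniform density on [m, M]
    return 1.0 / (M - m) if m <= x <= M else 0.0

# Midpoint-rule approximation of the integral from m to M
n = 100_000
dx = (M - m) / n
ex = sum((m + (i + 0.5) * dx) * f(m + (i + 0.5) * dx) * dx for i in range(n))

print(ex)            # ≈ 3.5, the midpoint of [2, 5]
assert m <= ex <= M  # the sanity check from the notes
```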
Expected value of a discrete random variable
Two ways to sum the terms to get the expected value of a random variable. They only differ
by how we group the values; the two ways are completely equivalent.
1. Sum over all possible outcomes in the sample space: E(X) = Σ_ω X(ω) P({ω}).
2. Group the outcomes by the value of X and sum over all possible values: E(X) = Σ_x x P(X = x).
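The two groupings can be compared on a tiny example; this sketch uses two fair coin tosses, with X the number of heads.

```python
from itertools import product

# X = number of heads in two fair coin tosses; each outcome has probability 1/4.
outcomes = list(product("HT", repeat=2))
p_outcome = 1 / len(outcomes)
X = lambda w: w.count("H")

# Way 1: sum over all outcomes in the sample space
e1 = sum(X(w) * p_outcome for w in outcomes)

# Way 2: group outcomes by the value of X, then sum x * P(X = x)
values = set(X(w) for w in outcomes)
e2 = sum(x * sum(p_outcome for w in outcomes if X(w) == x) for x in values)

assert e1 == e2 == 1.0  # same answer, just grouped differently
```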
Three equivalent views of the statement that X and Y are independent random variables.
You can treat any one of them as the definition, and show that the other two are equivalent
to the chosen statement.
1. The product of the masses of X and Y gives the joint mass: pX,Y(x, y) = pX(x)pY(y) for all x and y.
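Statement 1 can be checked concretely; this is a sketch with two hypothetical independent fair dice, where the joint mass is 1/36 for every pair.

```python
from fractions import Fraction

# Two independent fair dice: the joint mass should factor as
# pX,Y(x, y) = pX(x) * pY(y) for every pair (x, y).
pX = {x: Fraction(1, 6) for x in range(1, 7)}
pY = {y: Fraction(1, 6) for y in range(1, 7)}
joint = {(x, y): Fraction(1, 36) for x in range(1, 7) for y in range(1, 7)}

assert all(joint[(x, y)] == pX[x] * pY[y] for x in pX for y in pY)
```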
Using the joint mass to calculate the mass of a single random variable. For instance,
consider two random variables X and Y . If we sum the values of the joint mass pX,Y (x, y)
over all possible y values, we recover the mass of X by itself:

Σ_y pX,Y(x, y) = Σ_y P(X = x, Y = y) = P(X = x) = pX(x).
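Here is a small numerical sketch of this marginalization, using a made-up joint mass on {0, 1} × {0, 1}.

```python
from fractions import Fraction

# A small (hypothetical) joint mass; summing over y recovers the mass of X alone.
p_joint = {
    (0, 0): Fraction(1, 8), (0, 1): Fraction(3, 8),
    (1, 0): Fraction(2, 8), (1, 1): Fraction(2, 8),
}

def marginal_X(x):
    # pX(x) = sum over all y of pX,Y(x, y)
    return sum(p for (a, y), p in p_joint.items() if a == x)

assert marginal_X(0) == Fraction(1, 2)
assert marginal_X(1) == Fraction(1, 2)
assert marginal_X(0) + marginal_X(1) == 1  # the marginal is a genuine mass
```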
Expected value and variance of Binomial random variables
Perhaps the easiest way to compute the expected value of a Binomial random variable is
to use the interpretation that a Binomial(n, p) random variable is the sum of n independent
Bernoulli(p) random variables. Each Bernoulli term has expected value p, so E(X) = np;
since the terms are independent, the variances also add, giving Var(X) = np(1 − p).
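These formulas can be verified by computing the mean and variance directly from the Binomial mass; n = 10 and p = 0.3 here are arbitrary choices.

```python
from math import comb

# Binomial(n, p) mass: P(X = k) = C(n, k) p^k (1-p)^(n-k).
# Check that the mean is n*p and the variance is n*p*(1-p).
n, p = 10, 0.3
pmf = {k: comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)}

mean = sum(k * pk for k, pk in pmf.items())
var = sum((k - mean) ** 2 * pk for k, pk in pmf.items())

assert abs(mean - n * p) < 1e-9           # E(X) = np = 3
assert abs(var - n * p * (1 - p)) < 1e-9  # Var(X) = np(1-p) = 2.1
```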
Conditional probability mass function, also known as the conditional PMF or the conditional mass. The conditional mass of X given Y = y is
pX|Y(x|y) = P(X = x | Y = y) = P(X = x, Y = y) / P(Y = y) = pX,Y(x, y) / pY(y).

For all of this to make sense, we need pY(y) = P(Y = y) > 0.
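A small sketch of the conditional mass, using a made-up joint mass; note the guard that pY(y) is positive.

```python
from fractions import Fraction

# Conditional mass pX|Y(x|y) = pX,Y(x, y) / pY(y), defined only when pY(y) > 0.
p_joint = {
    (0, 0): Fraction(1, 8), (1, 0): Fraction(3, 8),
    (0, 1): Fraction(2, 8), (1, 1): Fraction(2, 8),
}
p_Y = lambda y: sum(p for (x, b), p in p_joint.items() if b == y)

def p_X_given_Y(x, y):
    assert p_Y(y) > 0  # the conditional mass only makes sense here
    return p_joint.get((x, y), Fraction(0)) / p_Y(y)

assert p_X_given_Y(0, 0) == Fraction(1, 4)
assert p_X_given_Y(1, 0) == Fraction(3, 4)
# For each fixed y, the conditional mass sums to 1:
assert p_X_given_Y(0, 1) + p_X_given_Y(1, 1) == 1
```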
Example of calculating the expected value of a discrete random variable
Let X be the number of girls born in 4 independent births of babies.
E(X) = 0 · P({(b, b, b, b)})
+ 1 · P({(g, b, b, b)}) + 1 · P({(b, g, b, b)}) + 1 · P({(b, b, g, b)}) + 1 · P({(b, b, b, g)})
+ …, continuing through the outcomes with two, three, and four girls.
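The sum can be carried out by brute force over all 16 outcomes, assuming girls and boys are equally likely in each birth.

```python
from itertools import product

# E(X) for X = number of girls in 4 births, assuming girls and boys are
# equally likely, so each of the 16 outcomes has probability 1/16.
outcomes = list(product("gb", repeat=4))

# Sum over all outcomes in the sample space
e = sum(w.count("g") * (1 / 16) for w in outcomes)
assert e == 2.0  # on average, 2 girls out of 4 births
```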
Another nice fact, same as for discrete random variables: If X and Y are independent
(we should only be using this rule if we are sure we have independence), then
E(XY ) = E(X)E(Y ).
E(g(X)h(Y)) = E(g(X)) E(h(Y)),

for any functions g and h.
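A concrete check of this product rule, sketched with two hypothetical independent fair dice and arbitrary choices of g and h:

```python
from fractions import Fraction
from itertools import product

# For independent X and Y, E(g(X)h(Y)) = E(g(X)) * E(h(Y)).
# X, Y independent fair dice; g(x) = x^2 and h(y) = y + 1 are arbitrary.
pX = {x: Fraction(1, 6) for x in range(1, 7)}
pY = {y: Fraction(1, 6) for y in range(1, 7)}

g = lambda x: x * x
h = lambda y: y + 1

# Left side: expectation of the product, over the (factored) joint mass
lhs = sum(g(x) * h(y) * pX[x] * pY[y] for x, y in product(pX, pY))
# Right side: product of the two expectations
rhs = sum(g(x) * pX[x] for x in pX) * sum(h(y) * pY[y] for y in pY)
assert lhs == rhs
```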
Bernoulli random variables, also called indicator random variables.
They take on only two values, namely 0 and 1. We say X is a Bernoulli(p) random variable if
X = 1 with probability p and X = 0 with probability q = 1 − p. Then

E(X) = 1 · P(X = 1) + 0 · P(X = 0) = p.
Joint probability mass function of two random variables, also known as the joint PMF
or the joint mass. The joint mass of two random variables gives the probability that each
of the random variables takes on the specified values. For example, pX,Y(3, 10) = P(X = 3, Y = 10).
Poisson random variables
These random variables apply when there is some kind of process for which the average
rate of occurrences (say, per time period) is known. We need the occurrences in non-overlapping
time periods to be independent.
What is the mass of a Poisson(λ) random variable? For rate λ > 0,

P(X = k) = e^(−λ) λ^k / k!,  k = 0, 1, 2, …
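As a numerical sanity check on the Poisson mass (λ = 3 is an arbitrary choice): truncating the sum at a large cutoff, the mass totals about 1 and the mean comes out to about λ.

```python
from math import exp, factorial

# Poisson(lam) mass: P(X = k) = e^(-lam) * lam^k / k!.
lam = 3.0
pmf = [exp(-lam) * lam**k / factorial(k) for k in range(60)]

# The mass sums to ~1 (the tail beyond k = 59 is negligible for lam = 3)
assert abs(sum(pmf) - 1.0) < 1e-9
# The mean of a Poisson(lam) variable is lam itself
assert abs(sum(k * p for k, p in enumerate(pmf)) - lam) < 1e-9
```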
Conditional density. The conditional density of X, given that Y = y, is only defined
for values y such that fY(y) > 0. For such y, we define the conditional density of X, given
Y = y, as

fX|Y(x|y) = fX,Y(x, y) / fY(y).

Notice that this is equivalent to fX,Y(x, y) = fX|Y(x|y) fY(y).
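A numerical sketch, using the made-up joint density fX,Y(x, y) = x + y on the unit square: then fY(y) = ∫_0^1 (x + y) dx = y + 1/2, and for each fixed y the conditional density integrates to 1 in x.

```python
# Hypothetical joint density f(x, y) = x + y on the unit square, so
# fY(y) = y + 1/2 and fX|Y(x|y) = (x + y) / (y + 1/2).
def f_X_given_Y(x, y):
    return (x + y) / (y + 0.5)

# Midpoint-rule check: for each fixed y, the conditional density
# integrates to 1 over 0 <= x <= 1.
n = 100_000
dx = 1.0 / n
for y in (0.2, 0.9):
    total = sum(f_X_given_Y((i + 0.5) * dx, y) * dx for i in range(n))
    assert abs(total - 1.0) < 1e-9
```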
a. Find P(X^2 + Y^2 ≤ 1).
b. Find the conditional probability P(X^2 + Y^2 ≤ 1 | X + Y ≤ 1).
c. Find P(Y ≤ X^2).
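The joint density for this exercise is not shown in this excerpt; purely as an illustration of how part a can be estimated, here is a Monte Carlo sketch assuming X and Y are independent Uniform(0, 1), in which case P(X^2 + Y^2 ≤ 1) is the area of the quarter disk, π/4.

```python
import random

# Hypothetical setup: X, Y independent Uniform(0, 1).
# Then P(X^2 + Y^2 <= 1) = pi/4 ≈ 0.785 (area of the quarter disk).
random.seed(0)
n = 200_000
hits = sum(1 for _ in range(n)
           if random.random() ** 2 + random.random() ** 2 <= 1)
print(hits / n)  # ≈ 0.785
```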