Practice Problems: September 22, 2014
1a. Variance of an Indicator. Suppose event A occurs with probability p, and X is an
indicator for A, i.e., X = 1 if A occurs and X = 0 otherwise. We already know E(X) = p.
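Since the problem asks for the variance of an indicator, here is a quick sketch of the computation Var(X) = E(X²) − E(X)², done directly from the two possible values of X (the function name is just for illustration):

```python
# Variance of an indicator (Bernoulli) variable, from the definition
# Var(X) = E(X^2) - E(X)^2. Since X takes only the values 0 and 1,
# X^2 = X, so E(X^2) = E(X) = p and Var(X) = p - p^2 = p(1 - p).
def indicator_variance(p):
    e_x = 1 * p + 0 * (1 - p)          # E(X) = p
    e_x2 = 1**2 * p + 0**2 * (1 - p)   # E(X^2) = p, because 0^2 = 0 and 1^2 = 1
    return e_x2 - e_x**2

print(indicator_variance(0.3))  # p(1 - p) = 0.3 * 0.7, i.e. about 0.21
```

Note the maximum p(1 − p) = 1/4 occurs at p = 1/2.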
1b. Butterflies. Al
Conditional density. The conditional density of X, given that Y = y, is only defined
for values y such that fY (y) > 0. For such y, we define the conditional density of X, given
Y = y, as

fX|Y (x|y) = fX,Y (x, y) / fY (y).
Notice that this is equivalent to fX,Y (x, y) = fX|Y (x|y) fY (y).
Conditional probability mass function, also known as the conditional PMF or the conditional mass. The conditional mass of X given Y = y is

pX|Y (x|y) = P (X = x | Y = y) = pX,Y (x, y) / pY (y) = P (X = x, Y = y) / P (Y = y).
For all of this to make sense, we need P (Y = y) > 0.
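The conditional mass above can be sketched in a few lines of Python; the joint mass here is a made-up example, not one from the notes:

```python
# Conditional PMF from a joint PMF, following the definition
# pX|Y(x|y) = pX,Y(x, y) / pY(y), defined only when pY(y) > 0.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}  # illustrative joint mass

def p_Y(y):
    # marginal mass of Y: sum the joint mass over all x values
    return sum(p for (x_, y_), p in joint.items() if y_ == y)

def cond_X_given_Y(x, y):
    py = p_Y(y)
    if py == 0:
        raise ValueError("conditional mass undefined when P(Y = y) = 0")
    return joint.get((x, y), 0.0) / py

print(cond_X_given_Y(1, 0))  # 0.3 / (0.1 + 0.3), about 0.75
```

The explicit check on P (Y = y) = 0 mirrors the "only defined for" caveat in the definition.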
Expected value and variance of Binomial random variables
Perhaps the easiest way to compute the expected value of a Binomial random variable is
to use the interpretation that a Binomial(n, p) random variable is the sum of n independent
Bernoulli(p) random variables, so that E(X) = np by linearity; using independence, Var(X) = np(1 − p).
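As a sanity check on E(X) = np and Var(X) = np(1 − p), here is a sketch that computes both directly from the Binomial PMF (helper names are mine, chosen for illustration):

```python
import math

# Mean and variance of Binomial(n, p), computed directly from the PMF
# p(k) = C(n, k) p^k (1-p)^(n-k), to compare with n*p and n*p*(1-p).
def binom_pmf(n, p, k):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def binom_mean(n, p):
    return sum(k * binom_pmf(n, p, k) for k in range(n + 1))

def binom_var(n, p):
    m = binom_mean(n, p)
    return sum((k - m) ** 2 * binom_pmf(n, p, k) for k in range(n + 1))

n, p = 10, 0.3
print(binom_mean(n, p), n * p)           # both about 3.0
print(binom_var(n, p), n * p * (1 - p))  # both about 2.1
```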
Using the joint mass to calculate the mass of a single random variable. For instance,
consider two random variables X and Y . If we sum the values of the joint mass pX,Y (x, y)
over all possible y values, we recover the mass of X:

Σ_y pX,Y (x, y) = Σ_y P (X = x, Y = y) = P (X = x) = pX(x).
Three equivalent views of the statement that X and Y are independent random variables.
You can treat any one of them as the definition, and show that the other two are equivalent
to the chosen statement.
1. The product of the masses of X and Y gives the joint mass: pX,Y (x, y) = pX(x) pY (y) for all x and y.
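View 1 can be checked mechanically: compute the marginals from a joint mass and test whether the joint factors as their product. Both joint masses below are illustrative examples, not from the notes:

```python
# Check whether a joint mass factors as the product of its own marginals,
# i.e. pX,Y(x, y) = pX(x) * pY(y) for every (x, y).
def factorizes(joint, tol=1e-12):
    xs = {x for x, _ in joint}
    ys = {y for _, y in joint}
    pX = {x: sum(joint.get((x, y), 0.0) for y in ys) for x in xs}
    pY = {y: sum(joint.get((x, y), 0.0) for x in xs) for y in ys}
    return all(
        abs(joint.get((x, y), 0.0) - pX[x] * pY[y]) < tol for x in xs for y in ys
    )

# factors: pX = (0.4, 0.6), pY = (0.25, 0.75)
indep = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.15, (1, 1): 0.45}
# X = Y with probability 1: clearly dependent
dep = {(0, 0): 0.5, (1, 1): 0.5}
print(factorizes(indep), factorizes(dep))  # True False
```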
Expected value of a discrete random variable
Two ways to sum the terms to get the expected value of a random variable. They differ
only in how we group the values; the two ways are completely equivalent.
1. Sum over all possible outcomes in the sample space: E(X) = Σ_ω X(ω) P ({ω}).
2. Sum over all possible values of the random variable: E(X) = Σ_x x P (X = x).
Example of calculating the expected value of a discrete random variable
Let X be the number of girls born in 4 independent births of babies.
E(X) = 0 P ({(b, b, b, b)})
+ 1 P ({(g, b, b, b)}) + 1 P ({(b, g, b, b)}) + 1 P ({(b, b, g, b)}) + 1 P ({(b, b, b, g)})
+ …
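The full expansion above can be carried out by brute force over the sample space. The notes do not state P(girl) for a single birth, so a fair 1/2 is assumed here purely for illustration:

```python
from itertools import product

# Expected number of girls in 4 independent births, summing over every
# outcome in the sample space (method 1 above). Assumes P(girl) = 1/2
# for each birth, so all 16 sequences are equally likely.
outcomes = list(product("bg", repeat=4))  # all 16 birth sequences
prob = 1 / 16                             # probability of each outcome
expected = sum(seq.count("g") * prob for seq in outcomes)
print(expected)  # 2.0
```

Linearity of expectation gives the same answer instantly: 4 × 1/2 = 2.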
Another nice fact, same as for discrete random variables: If X and Y are independent
(we should only be using this rule if we are sure we have independence), then
E(XY ) = E(X)E(Y ), and, more generally,
E(g(X)h(Y )) = E(g(X)) E(h(Y )),
for any functions g and h. It
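Both identities can be verified exactly on a small independent pair. The product-form joint mass below is an illustrative example, not from the notes:

```python
# Check E(XY) = E(X)E(Y) and E(g(X)h(Y)) = E(g(X))E(h(Y)) for an
# independent pair, built as a product-form joint mass.
pX = {1: 0.5, 2: 0.5}
pY = {0: 0.3, 4: 0.7}
joint = {(x, y): pX[x] * pY[y] for x in pX for y in pY}

def E(f):
    # expectation of f(x, y) under the joint mass
    return sum(f(x, y) * p for (x, y), p in joint.items())

lhs = E(lambda x, y: x * y)
rhs = E(lambda x, y: x) * E(lambda x, y: y)
print(abs(lhs - rhs) < 1e-12)  # True

g = lambda x: x ** 2
h = lambda y: y + 1
lhs_gh = E(lambda x, y: g(x) * h(y))
rhs_gh = E(lambda x, y: g(x)) * E(lambda x, y: h(y))
print(abs(lhs_gh - rhs_gh) < 1e-12)  # True
```

If the joint mass were not product-form, neither identity would be guaranteed, which is why the rule requires independence.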
Bernoulli random variables, also called indicator random variables.
These take on only two values, namely, 0 and 1. We say X is a Bernoulli(p) random variable if
X = 1 with probability p and X = 0 with probability q = 1 − p.
E(X) = 1 · P (X = 1) + 0 · P (X = 0) = p.
Joint probability mass function of two random variables, also known as the joint PMF
or the joint mass. The joint mass of two random variables gives the probability that each
of the random variables takes on the specified values. For example, pX,Y (3, 10) = P (X = 3, Y = 10).
Poisson random variables
These random variables apply when there is some kind of process for which the average
rate (say, per time period) is known. We also need the occurrences in non-overlapping
time periods to be independent.
What is the mass of a Poisson random variable?
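The notes cut off here; the standard Poisson mass, with λ the known average rate, is p(k) = e^(−λ) λ^k / k! for k = 0, 1, 2, …. A short sketch (`lam` stands in for λ):

```python
import math

# Mass of a Poisson(lam) random variable: p(k) = e^(-lam) * lam^k / k!,
# for k = 0, 1, 2, ...  lam is the known average rate per time period.
def poisson_pmf(lam, k):
    return math.exp(-lam) * lam**k / math.factorial(k)

lam = 2.0
print(poisson_pmf(lam, 0))                          # e^(-2), about 0.135
print(sum(poisson_pmf(lam, k) for k in range(60)))  # masses sum to about 1
```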