integral as a sum where X is a nonnegative integer random variable.
b) Generalize the above integral for the case of an arbitrary (rather than nonnegative) rv Y with distribution function F_Y(y); use a graphical argument.
c) Find E[Y] by the same type of argument.
d) For what value of α is E[|Y − α|] minimized? Use a graphical argument again.
Exercise 1.8. a) Let Y be a nonnegative rv and y > 0 be some fixed number. Let A be the event that Y ≥ y. Show that y·I_A ≤ Y (i.e., that this inequality is satisfied for every ω ∈ Ω).
b) Use your result in part a) to prove the Markov inequality.
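Both steps of this exercise can be probed with a quick Monte Carlo sketch (not part of the text; the exponential choice for Y and the threshold y = 2 are arbitrary assumptions made for illustration):

```python
import random

random.seed(1)

# Samples of a nonnegative rv Y; an exponential with mean 1 is an
# arbitrary illustrative choice.
samples = [random.expovariate(1.0) for _ in range(100_000)]
y = 2.0  # the fixed threshold of part a)

# Part a): y * I_A <= Y holds pointwise, where A = {Y >= y}
assert all(y * (s >= y) <= s for s in samples)

# Part b), the Markov inequality: Pr{Y >= y} <= E[Y] / y
pr_tail = sum(s >= y for s in samples) / len(samples)
mean_y = sum(samples) / len(samples)
print(pr_tail, mean_y / y)  # empirical tail probability vs. Markov bound
```

The pointwise inequality of part a) is exactly what makes the bound work: taking expectations of both sides of y·I_A ≤ Y gives y·Pr{A} ≤ E[Y].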
Exercise 1.9. Use the definition of a limit in the proof of Theorem 1.2 to show that the sequences in parts a) and b) satisfy lim_{n→∞} a_n = 0 but the sequence in part c) does not have a limit.
a) a_n = 1/ln(ln n)
b) a_n = n^10 exp(−n)
c) a_n = 1 for n = 10^k for each positive integer k, and a_n = 0 otherwise.
d) Show that the definition can be changed (with no change in meaning) by replacing δ with either 1/m or 2^−m for every positive integer m.

CHAPTER 1. INTRODUCTION AND REVIEW OF PROBABILITY

Exercise 1.10. Let X be a rv with distribution function F_X(x). Find the distribution
function of the following rv’s.
a) The maximum of n IID rv's with distribution function F_X(x).
b) The minimum of n IID rv's with distribution function F_X(x).
c) The difference of the rv's defined in a) and b); assume X has a density f_X(x).
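A candidate answer for parts a) and b) can be checked by simulation. The sketch below (not part of the text) uses Uniform(0,1) rv's, for which F_X(x) = x, and compares empirical frequencies against the candidate closed forms; n and the evaluation point are arbitrary choices:

```python
import random

random.seed(2)
n, trials = 5, 200_000
x = 0.7  # evaluation point; for Uniform(0,1), F_X(x) = x

hits_max = hits_min = 0
for _ in range(trials):
    xs = [random.random() for _ in range(n)]
    hits_max += max(xs) <= x  # {max <= x} = all n rv's are <= x
    hits_min += min(xs) <= x  # {min <= x} = not all n rv's are > x

F = x
print(hits_max / trials, F ** n)            # candidate: F_X(x)^n
print(hits_min / trials, 1 - (1 - F) ** n)  # candidate: 1 - (1 - F_X(x))^n
```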
Exercise 1.11. a) Let X_1, X_2, ..., X_n be rv's with expected values X̄_1, ..., X̄_n. Prove that E[X_1 + ··· + X_n] = X̄_1 + ··· + X̄_n. Do not assume that the rv's are independent.
b) Now assume that X1 , . . . , Xn are statistically independent and show that the expected
value of the product is equal to the product of the expected values.
c) Again assuming that X1 , . . . , Xn are statistically independent, show that the variance
of the sum is equal to the sum of the variances.
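All three claims are easy to probe numerically before proving them. This sketch (the three distributions are arbitrary choices, not from the text) estimates the mean of the sum, the mean of the product, and the variance of the sum for independent rv's:

```python
import random
from math import fsum

random.seed(3)
trials = 100_000
sums, prods = [], []
for _ in range(trials):
    x1 = random.uniform(0, 1)     # mean 1/2, variance 1/12
    x2 = random.expovariate(1.0)  # mean 1,   variance 1
    x3 = random.gauss(2, 0.5)     # mean 2,   variance 1/4
    sums.append(x1 + x2 + x3)
    prods.append(x1 * x2 * x3)

mean_sum = fsum(sums) / trials
mean_prod = fsum(prods) / trials
var_sum = fsum((s - mean_sum) ** 2 for s in sums) / trials

print(mean_sum)   # part a): ~ 1/2 + 1 + 2 = 3.5 (independence not needed)
print(mean_prod)  # part b): ~ (1/2)(1)(2) = 1.0 (independence used)
print(var_sum)    # part c): ~ 1/12 + 1 + 1/4   (independence used)
```

Note that part a) would hold even if the three draws were correlated; parts b) and c) genuinely use independence.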
Exercise 1.12. Let X_1, X_2, ..., X_n, ... be a sequence of IID continuous rv's with the common probability density function f_X(x); note that Pr{X = α} = 0 for all α and that Pr{X_i = X_j} = 0 for all i ≠ j. For n ≥ 2, define X_n as a record-to-date of the sequence if X_n > X_i for all i < n.
a) Find the probability that X_2 is a record-to-date. Use symmetry to obtain a numerical answer without computation. (A one- or two-line explanation should be adequate.)
b) Find the probability that X_n is a record-to-date, as a function of n ≥ 1. Again use symmetry.
c) Find a simple expression for the expected number of records-to-date that occur over the first m trials for any given integer m. Hint: Use indicator functions. Show that this expected number is infinite in the limit m → ∞.
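The symmetry answers can be checked by simulation. The sketch below (not part of the text; m and the uniform distribution are arbitrary choices) estimates the probability that X_n is a record-to-date and the expected number of records over m trials:

```python
import random

random.seed(4)
trials, m = 100_000, 10

n_record = [0] * (m + 1)  # n_record[n]: trials in which X_n was a record
total_records = 0
for _ in range(trials):
    best = float("-inf")
    for n in range(1, m + 1):
        x = random.random()
        if x > best:
            best = x
            if n >= 2:  # records-to-date are defined only for n >= 2
                n_record[n] += 1
                total_records += 1

# By symmetry, Pr{X_n is a record-to-date} should be 1/n
print(n_record[2] / trials, n_record[5] / trials)  # ~ 0.5, ~ 0.2
# Expected records over m trials: sum_{n=2}^{m} 1/n
print(total_records / trials, sum(1 / n for n in range(2, m + 1)))
```

Since the harmonic sum diverges, the expected count grows without bound as m → ∞, which is the point of part c).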
Exercise 1.13. (Continuation of Exercise 1.12)
a) Let N_1 be the index of the first record-to-date in the sequence. Find Pr{N_1 > n} for each n ≥ 2. Hint: There is a far simpler way to do this than working from part b) in Exercise 1.12.
b) Show that N_1 is a rv.
c) Show that E[N_1] = ∞.
d) Let N_2 be the index of the second record-to-date in the sequence. Show that N_2 is a rv. Hint: You need not find the distribution function of N_2 here.
e) Contrast your result in part c) to the result from part c) of Exercise 1.12 saying that the expected number of records-to-date is infinite over an infinite number of trials. Note: this should be a shock to your intuition; there is an infinite expected wait for the first of an infinite sequence of occurrences.

1.8. EXERCISES

Exercise 1.14. (Another direction from Exercise 1.12)
a) For any given n ≥ 2, find the probability that X_n and X_{n+1} are both records-to-date. Hint: The idea in part b) of Exercise 1.12 is helpful here, but the result is not.
b) Is the event that X_n is a record-to-date statistically independent of the event that X_{n+1} is a record-to-date?
c) Find the expected number of adjacent pairs of records-to-date over the sequence X_1, X_2, .... Hint: A helpful fact here is that 1/(n(n+1)) = 1/n − 1/(n+1).
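The hint's identity can be verified numerically, and a simulation (not part of the text; the sequence length and trial count are arbitrary) gives a feel for the expected number of adjacent record pairs:

```python
import random

# Telescoping check of the hint: sum_{n=2}^{N} 1/(n(n+1)) = 1/2 - 1/(N+1)
N = 1000
s = sum(1 / (n * (n + 1)) for n in range(2, N + 1))
print(s, 0.5 - 1 / (N + 1))

# Simulation: count adjacent pairs of records-to-date in length-m sequences
random.seed(5)
m, trials = 200, 20_000
pairs = 0
for _ in range(trials):
    best = float("-inf")
    prev_record = False
    for n in range(1, m + 1):
        x = random.random()
        is_record = n >= 2 and x > best  # records are defined for n >= 2
        if is_record and prev_record:
            pairs += 1
        prev_record = is_record
        if x > best:
            best = x

print(pairs / trials)  # finite, even though the number of records is not
```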
Exercise 1.15. a) Assume that X is a discrete rv taking on values a_1, a_2, ..., and let Y = g(X). Let b_i = g(a_i), i ≥ 1, be the ith value taken on by Y. Show that E[Y] = Σ_i b_i p_Y(b_i) = Σ_i g(a_i) p_X(a_i).
b) Let X be a continuous rv with density f_X(x) and let g be differentiable and monotonically increasing. Show that E[Y] = ∫ y f_Y(y) dy = ∫ g(x) f_X(x) dx.
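For part a), the two sums can be compared numerically on a small made-up example (the PMF and the function g below are arbitrary illustrative choices; g is deliberately not one-to-one, so probabilities must be accumulated per value of Y):

```python
from collections import defaultdict

# A small made-up discrete rv X and a function g that is not one-to-one
p_X = {-1: 0.2, 0: 0.3, 1: 0.4, 2: 0.1}

def g(x):
    return x * x

# PMF of Y = g(X): accumulate probability over all a with g(a) = b
p_Y = defaultdict(float)
for a, p in p_X.items():
    p_Y[g(a)] += p

e_via_Y = sum(b * p for b, p in p_Y.items())   # sum_i b_i p_Y(b_i)
e_via_X = sum(g(a) * p for a, p in p_X.items())  # sum_i g(a_i) p_X(a_i)
print(e_via_Y, e_via_X)  # the two expressions for E[Y] agree
```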
Exercise 1.16. a) Consider a positive, integer-valued rv Y whose distribution function is given at integer values by

F_Y(y) = 1 − 2/((y+1)(y+2))  for integer y ≥ 0.

Use (1.24) to show that E[Y] = 2. Hint: Note the PMF given in (1.21).
b) Find the PMF of Y and use it to check the value of E[Y].
c) Let X be another positive, integer-valued rv. Assume its conditional PMF is given by

p_{X|Y}(x|y) = 1/y  for 1 ≤ x ≤ y.

Find E[X | Y = y] and show that E[X] = 3/2. Explore finding p_X(x) until you are convinced that using the conditional expectation to calculate E[X] is considerably easier than using p_X(x).
d) Let Z be another integer-valued rv with the conditional PMF

p_{Z|Y}(z|y) = 1/y²  for 1 ≤ z ≤ y².

Find E[Z | Y = y] for each integer y ≥ 1 and find E[Z].
Exercise 1.17...
Spring '09, R. Srikant
