Homework 1 Solutions
1. If Fi are σ-algebras for each i ∈ I, then ∩_{i∈I} Fi is a σ-algebra (recall that
this was the property that allowed us to define σ(A)).
2. If F and G are σ-algebras, F ∪ G is not necessarily a σ-algebra.
1. The intersection ∩_{i∈I} Fi contains Ω and is closed under complements and countable unions, because each Fi is; hence it is a σ-algebra.
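For part 2, a minimal counterexample (the specific sets below are my choice, not from the original text):

```latex
\[
\Omega=\{1,2,3\},\qquad
\mathcal F=\{\emptyset,\{1\},\{2,3\},\Omega\},\qquad
\mathcal G=\{\emptyset,\{2\},\{1,3\},\Omega\}.
\]
\[
\{1\}\in\mathcal F,\quad \{2\}\in\mathcal G,\quad\text{but}\quad
\{1\}\cup\{2\}=\{1,2\}\notin\mathcal F\cup\mathcal G,
\]
so $\mathcal F\cup\mathcal G$ is not closed under finite unions,
hence not a $\sigma$-algebra.
```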
Theory of Probability I: Test II, Solutions
Exercise 1.1. A Poisson random variable N ∼ Poisson(λ) is distributed as
P[N = n] = e^{−λ} λ^n / n!,   n = 0, 1, 2, . . .
1. Compute the characteristic function of N.
2. Show that the sum of two independent Poisson random variables is again Poisson.
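As a quick numeric sanity check (not part of the original solutions; the function names and tolerances are mine): the characteristic function of Poisson(λ) is φ(t) = e^{λ(e^{it}−1)}, and the characteristic function of a sum of independent variables is the product of the individual ones, so Poisson(a) + Poisson(b) matches Poisson(a + b).

```python
import cmath
import math

def phi_direct(t, lam, nmax=200):
    """E[exp(itN)] for N ~ Poisson(lam), summing the pmf term by term."""
    total = 0.0 + 0.0j
    p = math.exp(-lam)          # P[N = 0]
    for n in range(nmax):
        total += cmath.exp(1j * t * n) * p
        p *= lam / (n + 1)      # P[N = n+1] from P[N = n]
    return total

def phi_closed(t, lam):
    """Closed form exp(lam * (e^{it} - 1))."""
    return cmath.exp(lam * (cmath.exp(1j * t) - 1))

t = 0.7
# direct summation agrees with the closed form
err_form = abs(phi_direct(t, 2.0) - phi_closed(t, 2.0))
# product of characteristic functions = characteristic function of the sum
err_sum = abs(phi_closed(t, 1.5) * phi_closed(t, 2.5) - phi_closed(t, 4.0))
print(err_form, err_sum)   # both essentially zero
```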
Homework 2, Solutions
Exercise 2.1. Let X be a random variable on a probability space (Ω, F, P). Show that if Y
is a random variable which is measurable with respect to σ(X), then there exists a (Borel)
measurable function f : R → R such that Y = f(X).
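One standard way to prove this is the "standard machine" (a sketch, not necessarily the course's official solution):

```latex
\begin{itemize}
  \item If $Y = \mathbf 1_A$ with $A \in \sigma(X)$, then $A = X^{-1}(B)$ for
        some Borel set $B$, so $Y = \mathbf 1_B(X)$, i.e.\ take $f = \mathbf 1_B$.
  \item By linearity, the claim holds for simple $\sigma(X)$-measurable $Y$.
  \item A general $Y$ is a pointwise limit of simple $Y_n = f_n(X)$; set
        $f(x) = \limsup_n f_n(x)$ where this limit is finite (a Borel set of $x$)
        and $f(x) = 0$ elsewhere. Then $f$ is Borel and $Y = f(X)$.
\end{itemize}
```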
Homework 3, Solutions
1. Suppose that X is a random variable with density function f, and P(α ≤ X ≤ β) = 1. Let g : (α, β) → R be a strictly increasing and differentiable function.
Compute the density of g(X) in terms of f and g.
2. Compute the d
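For part 1, the standard change-of-variables computation (a sketch, assuming g′ > 0; I write (α, β) for the interval carrying the law of X):

```latex
\[
F_{g(X)}(y) \;=\; \mathbb P\big(g(X)\le y\big)
\;=\; \mathbb P\big(X \le g^{-1}(y)\big)
\;=\; F_X\big(g^{-1}(y)\big),
\qquad y \in g\big((\alpha,\beta)\big),
\]
and differentiating in $y$ (inverse function rule) gives
\[
f_{g(X)}(y) \;=\; f\big(g^{-1}(y)\big)\,\frac{1}{g'\big(g^{-1}(y)\big)},
\]
with $f_{g(X)}(y) = 0$ outside the range of $g$.
```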
Solutions, Homework 4
Exercise 4.1. Let X1, X2, . . . be i.i.d. with P(Xi = (−1)^k k) = C/(k² log k) for k ≥ 2, where
C is chosen to make the sum of all probabilities one. Show that E[|Xi|] = ∞ but there is a
finite constant µ such that
Sn/n → µ
in probability as n → ∞, where Sn = X1 + · · · + Xn.
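A small numeric illustration of the mechanism (not from the solutions; the cutoffs below are arbitrary): E[|Xi|] = C Σ_{k≥2} k/(k² log k) = C Σ 1/(k log k), which diverges (like log log N), while the signed series Σ (−1)^k/(k log k) converges by the alternating series test, which is why Sn/n can still settle on a finite constant.

```python
import math

def abs_moment_partial(N):
    """Partial sum of sum_{k=2}^N 1/(k log k); proportional to E|X| truncated at N."""
    return sum(1.0 / (k * math.log(k)) for k in range(2, N + 1))

def signed_partial(N):
    """Partial sum of sum_{k=2}^N (-1)^k / (k log k); this one converges."""
    return sum((-1) ** k / (k * math.log(k)) for k in range(2, N + 1))

# The absolute series keeps growing without bound, the alternating one stabilizes.
for N in (10**3, 10**4, 10**5):
    print(N, abs_moment_partial(N), signed_partial(N))
```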
Solutions, Homework 5
Exercise 5.1. Consider the space L0(Ω, F, P) of measurable random variables (identified up
to a.s. equality). Denote by
d(X, Y ) = E[ |X − Y| / (1 + |X − Y|) ].
1. Show that d(X, Y) is a metric on L0 (which means d(X, Y) ≥ 0, d(X, Y) = 0 iff X = Y a.s., d is symmetric, and the triangle inequality holds).
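The triangle inequality reduces to a property of h(t) = t/(1 + t); a sketch in my notation, not the original text:

```latex
$h(t) = t/(1+t)$ is nondecreasing on $[0,\infty)$ and subadditive: for $s, t \ge 0$,
\[
h(s+t) = \frac{s+t}{1+s+t}
= \frac{s}{1+s+t} + \frac{t}{1+s+t}
\le \frac{s}{1+s} + \frac{t}{1+t} = h(s) + h(t).
\]
Hence
\[
h(|X-Z|) \;\le\; h\big(|X-Y| + |Y-Z|\big)
\;\le\; h(|X-Y|) + h(|Y-Z|),
\]
and taking expectations gives $d(X,Z) \le d(X,Y) + d(Y,Z)$.
```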
Solutions, Homework 6
Exercise 6.1. If Fn converges weakly to F and F is continuous, then supx |Fn(x) − F(x)| → 0.
Solutions: the solution uses the uniform continuity of the limit on compact
intervals. Let ε > 0. There exists M such that F(−M) < ε and 1 − F(M) < ε.
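From here the argument typically continues as follows (a standard Polya-type sketch; the grid points are my notation, not the original text):

```latex
Since $F$ is uniformly continuous on $[-M, M]$, choose points
$-M = x_0 < x_1 < \dots < x_k = M$ with $F(x_{j+1}) - F(x_j) < \varepsilon$.
For $n$ large enough, $|F_n(x_j) - F(x_j)| < \varepsilon$ for all $j$
(only finitely many points). For $x \in [x_j, x_{j+1}]$, monotonicity gives
\[
F_n(x) - F(x) \le F_n(x_{j+1}) - F(x_j)
\le F(x_{j+1}) + \varepsilon - F(x_j) \le 2\varepsilon,
\]
and symmetrically $F(x) - F_n(x) \le 2\varepsilon$; the tails $x < -M$ and
$x > M$ are handled using $F(-M) < \varepsilon$ and $1 - F(M) < \varepsilon$.
```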
Solutions, Homework 7
Exercise 7.1. Show that if Xn → X in probability then Xn ⇒ X. Conversely, if Xn ⇒ c
where c is a constant, then Xn → c in probability.
Solutions: We have already seen that, if Xn → X and g is bounded and continuous, we have
that E[g(Xn)] → E[g(X)]
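For the converse, a standard sketch (not the original continuation of the solution):

```latex
If $X_n \Rightarrow c$, the limiting distribution function is
$F = \mathbf 1_{[c,\infty)}$, which is continuous at every $x \ne c$.
For $\varepsilon > 0$,
\[
\mathbb P\big(|X_n - c| > \varepsilon\big)
\;\le\; F_n(c - \varepsilon) + 1 - F_n(c + \varepsilon)
\;\longrightarrow\; F(c - \varepsilon) + 1 - F(c + \varepsilon) = 0,
\]
since $c \pm \varepsilon$ are continuity points of $F$.
```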
Solutions, Homework 8
1. (conditional variance) Define
Var(X|G) = E[X²|G] − (E[X|G])².
Show that
Var(X) = E[Var(X|G)] + Var(E[X|G]).
2. Show that if X and Y are random variables with E[X|G] = Y and E[X²] = E[Y²],
then X = Y a.s.
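A quick exact check of the decomposition in part 1 on a toy discrete space (the values and the partition below are invented for illustration):

```python
# Omega = {0, 1, 2, 3} with uniform probabilities; G is generated by the
# partition {0, 1} | {2, 3}.  (All numbers here are arbitrary choices.)
x = [1.0, 2.0, 10.0, 20.0]
p = [0.25, 0.25, 0.25, 0.25]
blocks = [[0, 1], [2, 3]]          # the partition generating G

def mean(vals, probs):
    return sum(v * q for v, q in zip(vals, probs))

ex = mean(x, p)
var_x = mean([(v - ex) ** 2 for v in x], p)

# E[X | G] and Var(X | G) are constant on each block of the partition
e_cond = [0.0] * 4
v_cond = [0.0] * 4
for b in blocks:
    pb = sum(p[i] for i in b)
    m = sum(p[i] * x[i] for i in b) / pb
    v = sum(p[i] * (x[i] - m) ** 2 for i in b) / pb
    for i in b:
        e_cond[i], v_cond[i] = m, v

lhs = var_x                                   # Var(X)
rhs = mean(v_cond, p) + mean(                 # E[Var(X|G)] + Var(E[X|G])
    [(e - mean(e_cond, p)) ** 2 for e in e_cond], p)
print(lhs, rhs)   # the two sides agree
```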
1. Let (Xn)n be an adapted process and N a stopping time. Show that XN is measurable with respect to FN.
2. Let N ≤ M be two stopping times and A ∈ FN. Show that the random time
L = 1_A N + 1_{A^c} M
is a stopping time.
1. {XN ∈ A} = ∪_k ({N = k} ∩ {Xk ∈ A}). Now each set {N = k} ∩ {Xk ∈ A} belongs to Fk, because N is a stopping time and Xk is Fk-measurable.
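For part 2, a sketch of the standard argument (not the original solution text):

```latex
For every $n$,
\[
\{L \le n\} = \big(A \cap \{N \le n\}\big) \;\cup\; \big(A^c \cap \{M \le n\}\big).
\]
Since $A \in \mathcal F_N$, we have $A \cap \{N \le n\} \in \mathcal F_n$.
Since $\mathcal F_N$ is a $\sigma$-algebra, $A^c \in \mathcal F_N$, and
$N \le M$ implies $\mathcal F_N \subseteq \mathcal F_M$, so
$A^c \cap \{M \le n\} \in \mathcal F_n$. Hence $\{L \le n\} \in \mathcal F_n$
for every $n$, and $L$ is a stopping time.
```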
Theory of Probability I: Test I, Solutions
Exercise 1.1. (30 points) Consider X a square integrable random variable, i.e. E[X²] < ∞.
Denote by µ its expectation
µ = E[X]
and by σ its standard deviation
σ = √(E[(X − E[X])²]).
Show that, for n > 0 we have
P(|X − µ| ≥ nσ) ≤ 1/n².
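An exact numeric check of this Chebyshev bound, P(|X − µ| ≥ nσ) ≤ 1/n², on a small discrete distribution (the values and probabilities below are chosen arbitrarily for illustration):

```python
import math

vals = [-3.0, -1.0, 0.0, 2.0, 5.0]
probs = [0.1, 0.2, 0.4, 0.2, 0.1]

mu = sum(v * p for v, p in zip(vals, probs))
sigma = math.sqrt(sum((v - mu) ** 2 * p for v, p in zip(vals, probs)))

def tail(n):
    """P(|X - mu| >= n * sigma), computed exactly from the pmf."""
    return sum(p for v, p in zip(vals, probs) if abs(v - mu) >= n * sigma)

# the bound 1/n^2 dominates the exact tail at every tested level
for n in (0.5, 1.0, 1.5, 2.0):
    assert tail(n) <= 1.0 / n**2 + 1e-12, (n, tail(n))
print("Chebyshev bound holds at all tested levels")
```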
Course: M385C/CSE384K - Theory of Probability I
University of Texas at Austin
HW Assignment 3
Problem 3.1. Let (S, S, µ) be a measure space, and suppose that f ∈ L¹. Show that for each ε > 0 there
exists δ > 0 such that if A ∈ S and µ(A) < δ, then ∫_A |f| dµ < ε.
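A standard truncation argument (a sketch, not necessarily the intended solution):

```latex
Let $f_n = \min(|f|, n)$. By monotone convergence,
$\int (|f| - f_n)\, d\mu \to 0$, so choose $n$ with
$\int (|f| - f_n)\, d\mu < \varepsilon/2$ and set $\delta = \varepsilon/(2n)$.
Then for $A \in \mathcal S$ with $\mu(A) < \delta$,
\[
\int_A |f| \, d\mu
\;\le\; \int (|f| - f_n)\, d\mu + \int_A f_n \, d\mu
\;<\; \frac{\varepsilon}{2} + n\,\mu(A) \;<\; \varepsilon .
\]
```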