Stochastic Integrals.
In defining the integral
$$I = \int_0^T f(s)\,dx(s),$$
the Lebesgue theory assumes that $f(\cdot)$ is a bounded continuous function and $x(s)$ is a function of bounded variation, so that $dx$ can be thought of as a measure on $[0, T]$. With that we get the bound $|I| \le \sup_{0 \le s \le T} |f(s)| \cdot \mathrm{Var}_{[0,T]}(x)$ …
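As a quick illustration, the Riemann–Stieltjes sums behind this definition can be computed directly; the particular $f$ and $x$ below are illustrative choices, not from the text.

```python
import math

def stieltjes(f, x, T, n):
    """Left-endpoint Riemann-Stieltjes sum: sum of f(s_k) [x(s_{k+1}) - x(s_k)]."""
    ds = T / n
    return sum(f(k * ds) * (x((k + 1) * ds) - x(k * ds)) for k in range(n))

# Illustrative choices: f(s) = cos(s), x(s) = s^2 (smooth, hence of bounded variation).
f = math.cos
x = lambda s: s * s                                   # dx = 2s ds
approx = stieltjes(f, x, T=1.0, n=100000)
exact = 2.0 * (math.sin(1.0) + math.cos(1.0) - 1.0)   # = integral of 2s cos(s) on [0,1]
print(approx, exact)
```

The sums converge because $x$ has bounded variation; for Brownian paths this fails, which is what forces the stochastic (Itô) definition.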
11. Dynamic Programming.
If $x(t) = \int_0^t c(s,\omega)\,d\beta(s)$ and $|c(s,\omega)|^2 \le C$, then it is not hard to see that
$$E\big[x(t)^2\big] \le Ct.$$
But actually, if $\Phi(x)$ is any convex function,
$$E\big[\Phi(x(t))\big] \le \frac{1}{\sqrt{2\pi Ct}} \int_{-\infty}^{\infty} \Phi(y)\, e^{-\frac{y^2}{2Ct}}\,dy.$$
The solution $u(s,x)$ of
$$u_s + \frac{C}{2}\,u_{xx} = 0; \qquad u(t,x) = \Phi(x)$$
is given by
$$u(s,x) = \frac{1}{\sqrt{2\pi C(t-s)}} \int_{-\infty}^{\infty} \Phi(y)\, e^{-\frac{(y-x)^2}{2C(t-s)}}\,dy.$$
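A small Monte Carlo sketch of the second-moment bound, with a hypothetical bounded integrand $c(s,\omega) = \cos(\beta(s))$ (so $|c|^2 \le C = 1$); this is an illustration, not part of the notes.

```python
import math
import random

random.seed(0)

def sample_x(T=1.0, steps=200):
    """Euler approximation of x(T) = int_0^T cos(beta(s)) d beta(s)."""
    dt = T / steps
    beta, x = 0.0, 0.0
    for _ in range(steps):
        db = random.gauss(0.0, math.sqrt(dt))
        x += math.cos(beta) * db      # left-endpoint (non-anticipating) evaluation
        beta += db
    return x

paths = 5000
second_moment = sum(sample_x() ** 2 for _ in range(paths)) / paths
print(second_moment)  # comfortably below C*T = 1
```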
Section 12. Markov Chain Approximations
It is often necessary to approximate models in continuous time by discrete versions. The
simplest example is the approximation of Brownian motion by random walks. Let us consider independent random variables $X_i = \pm 1$ with probability $\frac{1}{2}$ each …
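The random-walk approximation can be sketched as follows; the parameter values are illustrative, and the variance check below is just a sanity test of the scaling.

```python
import math
import random

random.seed(1)

def scaled_walk(n, t=1.0):
    """S_[nt] / sqrt(n): the random-walk approximation of Brownian motion at time t."""
    steps = int(n * t)
    return sum(random.choice((-1, 1)) for _ in range(steps)) / math.sqrt(n)

# Var(S_[nt] / sqrt(n)) = [nt]/n -> t, matching Brownian motion.
n, t, trials = 400, 1.0, 10000
samples = [scaled_walk(n, t) for _ in range(trials)]
var = sum(v * v for v in samples) / trials
print(var)  # close to t = 1
```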
Section 10. Connections with PDE.
We have a progressively measurable stochastic process $x(t, \omega)$ on $(\Omega, \mathcal{F}_t, P)$ such that the paths are continuous with probability 1. We have bounded progressively measurable functions $b(t, \omega)$ and $a(t, \omega)$ with $a(t, \omega) \ge 0$. Moreover …
9. Diffusion Processes.
A diffusion process is a Markov process with continuous paths with values in some $\mathbb{R}^d$. Given the past history up to time $s$, the conditional distribution at a future time $t$ is given by the transition probability $p(s, x, t, dy)$. The family $\{p(s, x, t, \cdot)\}$ …
8. Stochastic Differential Equations.
Brownian motion has the property that the distribution of $x(t+h) - x(t)$, given the $\sigma$-field $\mathcal{F}_t$ of information up to time $t$, is Gaussian with mean 0 and variance $h$. One can visualize a Markov process $x(t)$ for which the corresponding …
7. Brownian Motion as a Markov process.
As a process with independent increments, given $\mathcal{F}_s$, $x(t) - x(s)$ is independent and has a normal distribution with mean 0 and variance $t - s$. Therefore
$$P[x(t) \in A \mid \mathcal{F}_s] = \int_A p(t-s, x(s), y)\,dy,$$
where
$$p(t, x, y) = \frac{1}{\sqrt{2\pi t}}\, e^{-\frac{(y-x)^2}{2t}}.$$
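For an interval $A = (a, b)$ the transition probability reduces to a difference of normal CDFs; a minimal sketch using the error function:

```python
import math

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def transition_prob(xs, s, t, a, b):
    """P[x(t) in (a, b) | F_s] when x(s) = xs, integrating the Gaussian kernel."""
    sd = math.sqrt(t - s)
    return Phi((b - xs) / sd) - Phi((a - xs) / sd)

p = transition_prob(xs=0.0, s=0.0, t=1.0, a=-1.0, b=1.0)
print(p)  # P[|x(1)| < 1 | x(0) = 0] = 2*Phi(1) - 1, about 0.6827
```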
Stochastic Calculus Fall 2009: Homework 5
1. Mean reversion. Consider the daily time series of dividend/split-adjusted closing prices for Exxon Mobil (XOM) and Chevron Texaco (CVX), from Yahoo! Finance. The respective prices are denoted by $X_t$ and $Y_t$. We …
Stochastic Calculus Fall 2009
Homework 4
1. First passage time of Brownian motion. Let $W(t)$ be a standard Brownian motion starting at zero. Derive the density of the first passage time $\tau_a$ of the level $W = a$. Verify that $E(\tau_a) = \infty$. [Hint: reflection principle.]
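As a sanity check on the reflection-principle answer, one can integrate the standard first-passage density $\frac{a}{\sqrt{2\pi t^3}} e^{-a^2/2t}$ numerically and compare with $P(\tau_a \le T) = 2\,P(W(T) \ge a)$; the $t^{-3/2}$ tail of this density is also why $E(\tau_a) = \infty$.

```python
import math

a, T = 1.0, 1.0

def density(t):
    """First-passage density of tau_a (a standard result via the reflection principle)."""
    return a / math.sqrt(2.0 * math.pi * t**3) * math.exp(-a * a / (2.0 * t))

# Midpoint rule; the integrand vanishes rapidly near t = 0, so there is no singularity.
n = 200000
dt = T / n
integral = sum(density((k + 0.5) * dt) for k in range(n)) * dt

Phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
target = 2.0 * (1.0 - Phi(a / math.sqrt(T)))
print(integral, target)  # both about 0.3173
```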
Stochastic Calculus Fall 2009
Homework 3
1. Multivariate models. Download end-of-day data for the components of the Dow Jones Industrial Average for the last year. Using this data, specify a model for the joint dynamics of the component stocks of the type …
Stochastic Calculus Fall 2009
Homework 2
1. Conditional Expectation. Let $(X, Y)$ be two random variables such that
$$f_{XY}(x, y) = \begin{cases} \frac{1}{\pi} & \text{if } x^2 + y^2 \le 1 \\ 0 & \text{if } x^2 + y^2 > 1 \end{cases} \qquad (1)$$
Compute $E(X \mid Y = y)$, $E(X^2 + Y^2 \mid Y = y)$, $E(X + Y \mid Y = y)$ and the probability density …
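Since the joint density is constant on the unit disk, $X$ given $Y = y$ is uniform on $[-\sqrt{1-y^2}, \sqrt{1-y^2}]$; a numerical check of the resulting conditional expectations (the closed forms in the comments are derived here, not quoted from the assignment):

```python
import math

def cond_exp(g, y, n=10000):
    """E[g(X, Y) | Y = y]: X is uniform on [-s, s], s = sqrt(1 - y^2),
    because the joint density is constant on the disk.  Midpoint-rule average."""
    s = math.sqrt(1.0 - y * y)
    dx = 2.0 * s / n
    return sum(g(-s + (k + 0.5) * dx, y) for k in range(n)) / n

y = 0.5
e1 = cond_exp(lambda x, y: x, y)              # E(X | Y=y) = 0 by symmetry
e2 = cond_exp(lambda x, y: x * x + y * y, y)  # = y^2 + (1 - y^2)/3 = 0.5 here
e3 = cond_exp(lambda x, y: x + y, y)          # = y
print(e1, e2, e3)
```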
The TA for the course is Mr. Vidal Alcala <alcala@cims.nyu.edu>.
He will run the problem session on Thursdays, 5:30 to 6:30, in room 813.
The first problem session will be next week, Sept 11, 2008. The problem
session will not meet this Thursday (Sept 4, 2008).
First-passage time of BM from a strip
Let $X(t)$ be a standard Wiener process (Brownian motion with variance 1
and drift 0). We consider the strip $\{(x, t) : (x, t) \in (a, b) \times (0, \infty)\}$ and ask
for the probability that BM does not exit the strip in time $T$. More …
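One way to sanity-check this probability numerically: compare the standard eigenfunction expansion for the survival probability with a crude simulation. The strip $(a, b) = (-1, 1)$, horizon $T = 1$, and start $x = 0$ are illustrative choices.

```python
import math
import random

random.seed(2)
a, b, T = -1.0, 1.0, 1.0

def survival_series(x=0.0, terms=50):
    """Eigenfunction expansion for P(BM started at x stays in (a,b) up to T)."""
    L = b - a
    total = 0.0
    for n in range(1, terms + 1, 2):  # even-index terms vanish
        total += (4.0 / (n * math.pi)) \
                 * math.exp(-n * n * math.pi**2 * T / (2.0 * L * L)) \
                 * math.sin(n * math.pi * (x - a) / L)
    return total

def survival_mc(x=0.0, paths=5000, steps=500):
    """Crude Monte Carlo with discrete monitoring (slightly overestimates survival)."""
    dt = T / steps
    sd = math.sqrt(dt)
    alive = 0
    for _ in range(paths):
        w = x
        for _ in range(steps):
            w += random.gauss(0.0, sd)
            if w <= a or w >= b:
                break
        else:
            alive += 1
    return alive / paths

series_val = survival_mc_val = None
series_val = survival_series()
mc_val = survival_mc()
print(series_val, mc_val)  # both near 0.37
```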
Final Examination.
Due Dec 15.
1. We have a population of size $N$ which completely renews itself every generation. The total size is $N$ in every generation. The population consists of two types of individuals, $A$ and $B$. If in a given generation the population …
Assignment 9.
1. Let us consider independent random variables $X_i = \pm 1$ with probability $\frac{1}{2}$ each,
$$S_n = X_1 + X_2 + \cdots + X_n,$$
and
$$\beta_n\!\left(\tfrac{j}{n}\right) = \frac{S_j}{\sqrt{n}},$$
with $\beta_n$ interpolated linearly between $\tfrac{j}{n}$ and $\tfrac{j+1}{n}$. Prove the estimate
$$E\big[\,|\beta_n(t) - \beta_n(s)|^4\,\big] \le C\,|t - s|^2$$
for the …
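For $s = 0$ and $t = j/n$ (writing $\beta_n(j/n) = S_j/\sqrt{n}$) the estimate can be checked exactly: for $\pm 1$ steps one can derive $E[S_j^4] = 3j^2 - 2j \le 3j^2$, hence $E[\beta_n(j/n)^4] \le 3\,(j/n)^2$. A brute-force confirmation by enumeration:

```python
from itertools import product

# Exact fourth moment of S_j by enumerating all 2^j sign sequences, compared
# with the derived closed form 3j^2 - 2j (an assumption verified here).
for j in range(1, 13):
    m4 = sum(sum(seq) ** 4 for seq in product((-1, 1), repeat=j)) / 2 ** j
    assert m4 == 3 * j * j - 2 * j
    print(j, m4)
```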
Assignment 8.
1. Solve the equation
$$u_t + \frac{\sigma^2 x^2}{2}\, u_{xx} + r x u_x - r u = 0 \quad \text{for } 0 \le t \le T,$$
with
$$u(T, x) = (x - c)^+ = \begin{cases} x - c & \text{if } x \ge c \\ 0 & \text{otherwise,} \end{cases}$$
explicitly in terms of the tails of a normal distribution.
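The explicit solution is the Black–Scholes call-price formula, written in terms of normal tails; a sketch (the parameter values below are illustrative):

```python
import math

def Phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def u(t, x, c, r, sigma, T):
    """Solution of u_t + (sigma^2 x^2 / 2) u_xx + r x u_x - r u = 0 with
    terminal data u(T, x) = (x - c)^+  (the Black-Scholes call price)."""
    tau = T - t
    d1 = (math.log(x / c) + (r + 0.5 * sigma**2) * tau) / (sigma * math.sqrt(tau))
    d2 = d1 - sigma * math.sqrt(tau)
    return x * Phi(d1) - c * math.exp(-r * tau) * Phi(d2)

price = u(t=0.0, x=100.0, c=100.0, r=0.05, sigma=0.2, T=1.0)
print(price)  # about 10.45
```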
2. Feynman–Kac formula. If $V$ is a bounded function and $\beta(t)$ is Brownian motion …
Assignment 7.
1. $\beta(t)$ is Brownian motion and $x(t) = \arctan \beta(t)$. Is $x(t)$ a Markov process? Does it satisfy a stochastic differential equation with respect to the Brownian motion $\beta(t)$? Does it satisfy
$$dx(t) = \sigma(x(t))\,d\beta(t) + b(x(t))\,dt$$
for some $\sigma$ and $b$? If so, write it down …
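Applying Itô's formula to $x = \arctan \beta$ suggests the candidate coefficients $\sigma(x) = \cos^2 x$ and $b(x) = -\sin x \cos^3 x$ (a derivation of ours to be verified, not quoted from the assignment); a quick pathwise check that an Euler scheme with these coefficients tracks $\arctan \beta(t)$:

```python
import math
import random

random.seed(3)

# Candidate SDE (assumed, from Ito's formula): dx = cos(x)^2 dB - sin(x) cos(x)^3 dt
T, steps = 1.0, 20000
dt = T / steps
sd = math.sqrt(dt)

beta, x = 0.0, 0.0
max_gap = 0.0
for _ in range(steps):
    db = random.gauss(0.0, sd)
    x += math.cos(x) ** 2 * db - math.sin(x) * math.cos(x) ** 3 * dt
    beta += db
    max_gap = max(max_gap, abs(x - math.atan(beta)))

print(max_gap)  # small: the Euler path of the SDE stays close to arctan(beta)
```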
Assignment 6.
1. Assume a family $p_h(dy)$ of probability distributions on $\mathbb{R}^d$ satisfies
$$\lim_{h \to 0} \frac{1}{h} \int y_i\, p_h(dy) = b_i, \qquad \lim_{h \to 0} \frac{1}{h} \int y_i y_j\, p_h(dy) = a_{i,j},$$
and for some $\delta > 0$
$$\lim_{h \to 0} \frac{1}{h} \int |y|^{2+\delta}\, p_h(dy) = 0.$$
Conclude that for any smooth function $f(y)$
$$\lim_{h \to 0} \frac{1}{h} \int \big[f(y) - f(0)\big]\, p_h(dy) = \sum_i b_i\, f_{y_i}(0) + \frac{1}{2} \sum_{i,j} a_{i,j}\, f_{y_i y_j}(0).$$
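A one-dimensional sanity check with the hypothetical family $p_h = N(bh,\, ah)$, which satisfies the three conditions with $b_1 = b$, $a_{1,1} = a$; for a polynomial $f$ the quotient can be evaluated exactly from Gaussian moments:

```python
# Check in d = 1 with p_h = N(b*h, a*h) (an illustrative family, not from the text).
b, a = 0.7, 2.0
f = lambda y: 1.0 + 2.0 * y + 3.0 * y * y + y**3   # smooth test function
# f'(0) = 2, f''(0) = 6, so the claimed limit is 2*b + 3*a = 7.4 here.

def generator_quotient(h):
    """(1/h) * integral of [f(y) - f(0)] p_h(dy), exact via Gaussian moments."""
    mu, var = b * h, a * h
    ey, ey2, ey3 = mu, var + mu * mu, mu**3 + 3.0 * mu * var
    ef = 1.0 + 2.0 * ey + 3.0 * ey2 + ey3
    return (ef - f(0.0)) / h

for h in (1e-1, 1e-2, 1e-3):
    print(h, generator_quotient(h))  # tends to 2*b + 3*a = 7.4 as h -> 0
```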