7/29/2019 Stochastic Differential Equations. Introduction to Stochastic Models for Pollutants Dispersion, Epidemic and Finance
Stochastic Differential Equations.
Introduction to Stochastic Models for Pollutants
Dispersion, Epidemic and Finance
15th March-April 19th, 2011at Lappeenranta University of Technology(LUT)-Finland
By Dr. W.M. Charles: University of Dar-Es-salaam-Tanzania
andDr J.A.M. van der Weide:
Delft University of Technology, The Netherlands
Contents

1 Introduction
  1.1 Objectives
  1.2 Stochastic modelling
    1.2.1 Probability Models
    1.2.2 Definitions
2 Conditional Probability and Expectation
  2.0.3 More Properties of Conditional Expectation
  2.1 Stochastic Processes
  2.2 The Gaussian Distribution
  2.3 Wiener Process
    2.3.1 Random walk Construction
    2.3.2 Diffusion processes
3 Stochastic Integrals
4 Ito Integral Process
  4.1 Motivation and problem formulation
  4.2 Stochastic Differential Equations
  4.3 Linear Stochastic Differential Equations
5 Ito's Formula
  5.1 The Multidimensional Ito Formula
  5.2 Applications of Ito formula
    5.2.1 Examples of Linear SDEs with additive noise
    5.2.2 Examples of Linear SDEs with multiplicative noise
    5.2.3 Relation between Ito and Stratonovich SDEs
6 Connection between Stochastic Differential Equations and PDEs
  6.1 Markov Processes and Transition Density
  6.2 Transition Density Estimation
  6.3 Forward density estimation
  6.4 The forward-reverse formulation
  6.5 The Generator of the Ito Diffusion
  6.6 Kolmogorov Backward Equation (KBE)
  6.7 Feynman-Kac representation formula
  6.8 1-dimensional Fokker-Planck equation (FPE)
    6.8.1 d-dimensional Fokker-Planck equation (FPE)
  6.9 Definition of order of convergence of a numerical scheme
  6.10 Derivation of numerical schemes for SDEs
    6.10.1 Stochastic Taylor expansion and derivation of stochastic numerical schemes
    6.10.2 Numerical schemes
7 Application of SDEs
  7.1 Introduction to particle models and their application to model transport in shallow water
  7.2 Diffusion and dispersion
    7.2.1 Molecular diffusion
  7.3 Molecular diffusion with a constant diffusion coefficient
  7.4 Molecular diffusion with a space-varying diffusion coefficient
  7.5 Advection-diffusion process for a two-dimensional model
  7.6 Consistency of the particle model with the ADEs
  7.7 Introduction of SDEs to Model the Dynamics of Electricity and Oil Spot Prices
8 Application of SDEs to Finance
  8.1 Feynman-Kac representation formula
  8.2 Financial Markets
  8.3 The One-Period Binomial Model
  8.4 The Discrete Model
  8.5 The Multi-Period Binomial Model
  8.6 The Financial Market and the Black-Scholes Model
  8.7 The Black-Scholes Model
  8.8 Exercises
  8.9 Appendix
  8.10 Appendix II
  8.11 Summary
Chapter 1
Introduction
This is an introduction to the theory of stochastic differential equations (SDEs) for those who wish to model the dynamics of systems in chemistry, biology, finance, economics and population dynamics, to mention but a few. It assumes that the learner has some background in statistics and probability theory, but it begins with definitions of important concepts that recur in this course without being treated here in depth.
1.1 Objectives
1. To provide an understanding of modelling problems related to stochastic differential equations.

2. To introduce practical skills and solution methods, both numerical and analytical.

3. To describe areas of application, such as the dispersion of pollutants in shallow water.
1.2 Stochastic modelling
Several textbooks provide full details of the background in probability theory and stochastic calculus; see for example Arnold (1974), Øksendal (2003), Gihman (1972), Kloeden (1999). The main definitions discussed in this chapter are taken from these textbooks.
1.2.1 Probability Models
Stochastic calculus is concerned with the study of stochastic processes, which model uncertainty. The basic object in a probability model is a probability space: a triple (Ω, F, P) consisting of a set Ω,
usually called the sample space, a σ-field F of subsets of Ω, and a probability P defined on F. The set Ω can be considered as the set of all possible scenarios that can occur. To any event we associate the subset A ⊆ Ω consisting of all scenarios at which the event occurs. Such a subset will also be called an event, and F is the collection of all events. From a mathematical point of view, it is important to consider only collections of events that have the structure of a σ-field.
1.2.2 Definitions
Definition 1 A collection F of subsets of a set Ω is called a σ-field if
1. Ω ∈ F;
2. if A ∈ F, then A^c = Ω \ A ∈ F;
3. if (A_n) is a sequence in F, then ∪_{n=1}^∞ A_n ∈ F.

A measurable space is a pair (Ω, F), where Ω is a set and F a σ-field of subsets of Ω. As an example, the collection P(Ω) of all subsets of Ω is a σ-field.
Definition 2 A probability P defined on a σ-field F is a map from F to the interval [0, 1] such that

1. P(Ω) = 1;

2. P(∪_{n=1}^∞ A_n) = Σ_{n=1}^∞ P(A_n) for any pairwise disjoint sequence (A_n) ⊆ F. Pairwise disjoint means that A_i ∩ A_j = ∅ for i ≠ j.
Definition 3 A random variable
A random variable is a real function X(ω), ω ∈ Ω, that is measurable with respect to the probability measure P; that is, X : Ω → R.
Definition 4 Distribution function
The probabilistic behaviour of X(ω) is completely and uniquely specified by the distribution function F(x) = P({ω : X(ω) < x}).
Definition 5 Continuous random variable
X(ω) is a continuous random variable if there exists f(x) (the density function) such that f(x) ≥ 0, ∫_{−∞}^{∞} f(x) dx = 1 and F(x) = ∫_{−∞}^{x} f(u) du. Random variables can have different distribution functions, for example Poisson, exponential or Gaussian, and they can take widely varying values. The moments of a random variable describe various characteristics of its distribution.
Definition 6 Expectation (mean) of a random variable
If X is a random variable defined on the probability space (Ω, F, P), then the expected value, or mean, of X is E(X) = ∫_Ω X dP. This is the average of X over the entire probability space. For a continuous random variable,
E(X) = μ = ∫_{−∞}^{∞} x f(x) dx.
Definition 7 Variance
Variance is a measure of the spread of the data about the mean:
Var(X) = E((X − μ)²) = E(X²) − μ².
The standard deviation is σ = √Var(X).
Definition 8 The kth-order moment
The kth-order moment of a continuous random variable is defined by
K_k = E(X^k) = ∫_{−∞}^{∞} x^k f(x) dx.
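The mean, variance and higher moments above can be checked numerically. The following sketch does so for an exponential density (the distribution, rate and grid are our own illustrative choices, not taken from the notes), approximating each integral ∫ x^k f(x) dx by a discrete sum.

```python
import numpy as np

# Numerical check of Definitions 6-8 for the exponential density
# f(x) = lam * exp(-lam * x) (illustrative choice, not from the notes).
lam = 2.0
x = np.linspace(0.0, 40.0, 400001)    # the tail beyond 40 is negligible
dx = x[1] - x[0]
f = lam * np.exp(-lam * x)            # density: nonnegative, integrates to 1

def moment(k):
    """k-th order moment E[X^k] as a discrete approximation of x^k f(x) dx."""
    return np.sum(x**k * f) * dx

mean = moment(1)                      # exact value 1/lam = 0.5
var = moment(2) - mean**2             # Var(X) = E(X^2) - mu^2, exact 1/lam^2 = 0.25
std = np.sqrt(var)
```

The computed values agree with the closed-form moments of the exponential distribution to the accuracy of the grid.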
The expectation satisfies various properties, such as linearity; see Øksendal (2003) or Jazwinski (1970), for example.
Definition 9 Gaussian random variable
A random variable X is a Gaussian random variable if it has the Gaussian (or normal) density function
f(x) = (1/(σ√(2π))) exp(−(x − μ)²/(2σ²)),
where μ is the mean and σ² is the variance of the normal distribution N(μ, σ²). The function f(x) is bell-shaped, centred at x = μ, and stretched or compressed according to the magnitude of σ²; see Figure 1.1 (a)-(b). The maximum value, 1/(σ√(2π)), is attained at x = μ. When μ = 0 and σ = 1, the distribution N(0, 1) is known as the standard Gaussian distribution.
[Figure 1.1: Gaussian density functions p(x) plotted for x = −4 : 4. (a) μ = 0, σ = 1; maximum value p(0) = 0.3989. (b) μ = 0, σ = 0.5; maximum value p(0) = 0.7979.]
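The peak values quoted for Figure 1.1 can be verified directly from the density formula: the maximum 1/(σ√(2π)) is attained at x = μ. A small sketch (not part of the notes):

```python
import math

# The Gaussian density and its peak value 1/(sigma * sqrt(2 * pi)) at x = mu.
def gaussian_pdf(x, mu=0.0, sigma=1.0):
    """Gaussian density f(x) = exp(-(x - mu)^2 / (2 sigma^2)) / (sigma sqrt(2 pi))."""
    return math.exp(-(x - mu) ** 2 / (2.0 * sigma**2)) / (sigma * math.sqrt(2.0 * math.pi))

peak_sigma_1 = gaussian_pdf(0.0, sigma=1.0)      # p(0) for sigma = 1, about 0.3989
peak_sigma_half = gaussian_pdf(0.0, sigma=0.5)   # p(0) for sigma = 0.5, about 0.7979
```

Halving σ doubles the peak, matching panels (a) and (b) of the figure.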
Definition 10 Covariance
The covariance of two random variables X and Y is defined to be
Cov(X, Y) = E((X − μ₁)(Y − μ₂)) = E(XY) − E(X)E(Y),
where μ₁ = E(X) and μ₂ = E(Y).
Let us now consider the convergence of random variables. Let X and X_n, n = 1, 2, . . ., be real-valued random variables defined on a probability space (Ω, F, P), with distribution functions F and F_n respectively. The convergence of the sequence X_n to X can be defined in several ways, depending on how the difference between X_n and X is measured. Consider the following definitions.
Definition 11 Convergence with probability one (w.p.1), i.e. a.s.
A sequence of random variables {X_n(ω)} converges with probability one to X(ω) if
P({ω : lim_{n→∞} X_n(ω) = X(ω)}) = 1.
This is also called almost sure convergence.
Definition 12 Convergence in mean square sense
A sequence of random variables {X_n(ω)} with E(X_n²(ω)) < ∞ for all n converges in mean square to X(ω) if
lim_{n→∞} E|X_n − X|² = 0.
Definition 13 Convergence in distribution
A sequence of random variables {X_n(ω)} converges in distribution to X if
lim_{n→∞} F_n(x) = F(x) for every x ∈ R at which F is continuous.
Definition 14 Convergence in probability
A sequence of random variables {X_n(ω)} converges in probability (or stochastically) to X if, for every ε > 0,
lim_{n→∞} P({ω : |X_n(ω) − X(ω)| ≥ ε}) = 0.
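Convergence in probability can be illustrated by the sample means of coin flips: X_n = (ξ_1 + ... + ξ_n)/n converges to 1/2, so the deviation probability P(|X_n − 1/2| ≥ ε) shrinks as n grows. A seeded Monte Carlo sketch (the sample sizes and ε are our own choices):

```python
import numpy as np

# Estimate P(|X_n - 1/2| >= eps) for sample means X_n of Bernoulli(1/2) trials.
rng = np.random.default_rng(0)
eps = 0.05
reps = 2000

def deviation_probability(n):
    """Monte Carlo estimate of P(|X_n - 1/2| >= eps) over `reps` repetitions."""
    means = rng.binomial(n, 0.5, size=reps) / n
    return float(np.mean(np.abs(means - 0.5) >= eps))

p_small_n = deviation_probability(20)     # deviations are still common
p_large_n = deviation_probability(2000)   # deviations have become very rare
```

The estimated deviation probability drops from well above 1/2 for n = 20 to essentially zero for n = 2000.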
Definition 15 Stochastic processes
A stochastic process is a family of random variables X(t, ω) of two variables t ∈ T, ω ∈ Ω on a common probability space (Ω, F, P), which assumes real values and is P-measurable as a function of ω for each fixed t. The parameter t is interpreted as time, with T being a time interval. X(t, ·) represents a random variable on the above probability space, while X(·, ω) is called a sample path or trajectory of the stochastic process.
Definition 16 Stationary process
A stochastic process X(t) with E(|X(t)|²) < ∞, t ∈ T, is said to be stationary if its distribution is invariant under time displacements:
F_{X_1,...,X_n}(t_1 + h, t_2 + h, . . . , t_n + h) = F_{X_1,...,X_n}(t_1, t_2, . . . , t_n).
That is, all finite-dimensional distributions of X are invariant under an arbitrary time shift. If X is stationary, then the finite-dimensional distributions of X depend only on the lags between the times {t_1, . . . , t_n} rather than on their values. In particular, the distribution of X(t) is the same for all t ∈ T.
Definition 17 A continuous-time stochastic process X = {X(t), t ≥ 0} is called a Markov process if it satisfies the Markov property, i.e.

P(X(t_{n+1}) ∈ B | X(t_1) = x_1, . . . , X(t_n) = x_n) = P(X(t_{n+1}) ∈ B | X(t_n) = x_n)

for all Borel subsets B of R, time instants 0 < t_1 < t_2 < . . . < t_n < t_{n+1} and all states x_1, x_2, . . . , x_n for which the conditional probabilities are defined. That is, the future behaviour of the process depends on the past only through the current state.
Let X be a Markov process and write its transition probabilities as

P(s, x; t, B) = P(X(t) ∈ B | X(s) = x), 0 ≤ s < t.

If the probability distribution is discrete, the transition probabilities are uniquely determined by the transition matrix with components

P(s, i; t, j) = P(X(t) = x_j | X(s) = x_i),

the probability of moving from state i at time s to state j at time t; the states can simply be taken as values of a random variable X. In the continuous case we have

P(s, x; t, B) = ∫_B f(s, x; t, y) dy
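In the discrete, homogeneous case the transition matrix determines all multi-step probabilities: the two-step transition matrix is the matrix product P·P (the Chapman-Kolmogorov relation). A sketch with a hypothetical two-state chain (the numbers are illustrative, not from the notes):

```python
import numpy as np

# A two-state homogeneous Markov chain; row i lists the probabilities of
# moving from state i to each state in one step.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

P2 = P @ P                    # two-step transition probabilities (Chapman-Kolmogorov)
row_sums = P2.sum(axis=1)     # each row of P2 is again a probability distribution
```

Each row of a transition matrix, and of any of its powers, sums to one.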
for all Borel sets B, where the density f(s, x; t, ·) is called the transition density. A Markov process is said to be homogeneous if all its transition probabilities depend only on the time difference t − s rather than on the specific values of s and t.

Definition 18 A stochastic process X is called an {F_t}-martingale if the following conditions hold:

1. X is adapted to the filtration {F_t}_{t≥0};

2. for all t, E[|X(t)|] < ∞;

3. for all s and t with s ≤ t,
E[X(t) | F_s] = X(s), 0 ≤ s ≤ t.

Note: if E[X(t) | F_s] ≤ X(s), 0 ≤ s ≤ t, then X is said to be an {F_t}-supermartingale, while if E[X(t) | F_s] ≥ X(s), then X is said to be an {F_t}-submartingale. Condition (1) says that we can observe the value X(t) at time t, and condition (2) is a technical (integrability) condition. The really important condition is the third: the expectation (best estimate) of a future value X(t), given the information F_s available today, equals today's observed value X(s), for t ≥ s.
Chapter 2
Conditional Probability and Expectation
In this chapter we review conditioning, conditional probability and conditional expectation. From first courses in statistics we know the definition of the conditional probability of the event B given the occurrence of the event A:

P(B | A) = P(B ∩ A) / P(A).
Here it is required that P(A) > 0. So, if X is a random variable with a probability density f, i.e.

P(a ≤ X ≤ b) = ∫_a^b f(x) dx,
the definition of the conditional probability cannot be applied if we condition on the event A = {X = a}. In first courses in statistics one usually defines this conditional probability by a limit argument as follows. Consider a pair of random variables (X, Y) with joint probability density f_{X,Y}, i.e.

P((X, Y) ∈ G) = ∫∫_G f_{X,Y}(u, v) du dv.

It follows that

P(Y ≤ b | a ≤ X ≤ a + ε) = [∫_{−∞}^b ∫_a^{a+ε} f_{X,Y}(u, v) du dv] / [∫_{−∞}^{∞} ∫_a^{a+ε} f_{X,Y}(u, v) du dv].

Assuming that the joint density is smooth, we can calculate the limit as ε ↓ 0:

lim_{ε↓0} P(Y ≤ b | a ≤ X ≤ a + ε) = ∫_{−∞}^b [f_{X,Y}(a, v) / f_X(a)] dv,
where

f_X(a) = ∫_{−∞}^{∞} f_{X,Y}(a, v) dv

denotes the (marginal) density of the random variable X. The function

f_{Y|X=a}(v) = f_{X,Y}(a, v) / f_X(a)

is a probability density, called the conditional density of Y given X = a. Using this density, the conditional expectation of Y given X = a is defined as

E(Y | X = a) = ∫ v f_{Y|X=a}(v) dv = ∫ v [f_{X,Y}(a, v) / f_X(a)] dv.
So we define E(Y | X) as a random variable; the value it takes depends on the value of X:

E(Y | X) = E(Y | X = a) on {X = a}.

It follows that for bounded Borel functions φ,

E[φ(X) E(Y | X)] = ∫ φ(a) E(Y | X = a) f_X(a) da
  = ∫ φ(a) [∫ v f_{X,Y}(a, v) / f_X(a) dv] f_X(a) da
  = ∫∫ φ(a) v f_{X,Y}(a, v) dv da
  = E[φ(X) Y].
It is this property that we will use as the defining property of the conditional expectation in a more general set-up, where we do not have to assume the existence of probability densities. Let (Ω, F, P) be a probability space and let G ⊆ F be a sub-σ-algebra. Let X be an integrable random variable, i.e. E|X| < ∞. Define the set function Q on G by

Q(G) = ∫_G X dP.

It follows that Q is a measure on (Ω, G) which is absolutely continuous with respect to the restriction of P to G: P(G) = 0 ⟹ Q(G) = 0. The Radon-Nikodym theorem implies the existence of a density of Q with respect to P, i.e. a G-measurable random variable Y such that

Q(G) = ∫_G Y dP,
or, more generally,

∫ Z dQ = ∫ ZY dP

for any nonnegative, G-measurable random variable Z. The density Y is unique modulo P-null sets and is sometimes denoted as a derivative: Y = dQ/dP.
Definition 19 The conditional expectation E(X | G) of X given G is defined as the G-measurable random variable satisfying the relation

∫_G X dP = ∫_G E(X | G) dP

for every G ∈ G.

To see what this means, consider the special case where G = {∅, A, A^c, Ω} and X = 1_B. It follows from the definition that

E(1_B | G) = P(B | A) 1_A + P(B | A^c) 1_{A^c}.

More generally, let G be a finite sub-σ-algebra. Then there exists a (unique) G-measurable partition A = {A_1, . . . , A_n} such that every element of G can be represented as a union of partition elements. Every G-measurable random variable Y is constant on the partition elements and can be represented as
Y = Σ_{i=1}^n y_i 1_{A_i}.
So, in particular, there exist real numbers x_i such that

E(X | G) = Σ_{i=1}^n x_i 1_{A_i}.

Now,

∫_{A_j} X dP = ∫_{A_j} E(X | G) dP = ∫_{A_j} Σ_{i=1}^n x_i 1_{A_i} dP = x_j P(A_j),

hence

x_j = (1/P(A_j)) ∫_{A_j} X dP =: E(X | A_j),

and

E(X | G) = Σ_{i=1}^n E(X | A_i) 1_{A_i}.
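On a finite sample space this formula is easy to compute directly: E(X | G) is constant on each partition cell and equals the cell average E(X | A_i). A sketch with a uniform six-point sample space and a two-cell partition (our own illustrative choices):

```python
import numpy as np

# Conditional expectation for a finite sigma-algebra on Omega = {0,...,5}
# with uniform P and the partition {A1, A2} below.
p = np.full(6, 1.0 / 6.0)                  # uniform probability on Omega
X = np.arange(6, dtype=float)              # the random variable X(w) = w
cells = [np.array([0, 1, 2]), np.array([3, 4, 5])]

cond_exp = np.empty(6)
for A in cells:
    # E(X | A) = (1 / P(A)) * integral of X over A with respect to P
    cond_exp[A] = (p[A] * X[A]).sum() / p[A].sum()

# Defining property: E(X | G) has the same integral as X over every cell,
# hence the same overall expectation E(E(X | G)) = E(X).
total = (p * cond_exp).sum()               # equals E(X) = 2.5
```

The result is the piecewise-constant random variable taking the value 1 on {0, 1, 2} and 4 on {3, 4, 5}.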
If G and X are independent, then for any G ∈ G,

∫_G X dP = E(X 1_G) = E(X) E(1_G) = ∫_G E(X) dP,

so E(X | G) = E(X). In the next theorem we present a number of properties of the conditional expectation.
Theorem 1
(a) If G = {∅, Ω}, then E(X | G) = E(X);
(b) if Z is bounded and G-measurable, then E(XZ | G) = Z E(X | G);
(c) if G_1 ⊆ G_2, then E(E(X | G_2) | G_1) = E(X | G_1);
(d) if g is a convex function on the range of X, then g(E(X | G)) ≤ E(g(X) | G);
(e) if 0 ≤ X_n and X_n ↑ X, then E(X_n | G) ↑ E(X | G);
(f) if 0 ≤ X_n, then E(lim inf_{n→∞} X_n | G) ≤ lim inf_{n→∞} E(X_n | G);
(g) if lim_{n→∞} X_n = X almost surely and |X_n| ≤ Y with E(Y) < ∞, then
lim_{n→∞} E(X_n | G) = E(X | G).
The conditional expectation E(X | G) can be considered as a projection of the random variable X on the space of G-measurable random variables, as follows. Let X ∈ L²(Ω, F, P), the vector space of (equivalence classes of) square-integrable random variables. With the inner product

⟨X, Y⟩ = E(XY),

L²(Ω, F, P) is a Hilbert space. It follows from Theorem 1(d) that

E(|E(X | G)|²) ≤ E(E(X² | G)) = E(X²) < ∞.

Hence the map

X ∈ L²(Ω, F, P) ↦ E(X | G) ∈ L²(Ω, G, P)

is a linear contraction. It is the orthogonal projection of L²(Ω, F, P) onto L²(Ω, G, P).
2.0.3 More Properties of Conditional Expectation
Definition 20 Conditional Expectation
Let (Ω, F, P) be a probability space and let G be a sub-σ-algebra of F. Let X be a random variable on (Ω, F, P). Then E[X | G] is defined to be any random variable Y that satisfies:
(a) Y is G-measurable;
(b) for every A ∈ G we have the partial averaging property

∫_A Y dP = ∫_A X dP.
1. (Role of independence): If a random variable X is independent of a σ-algebra H, then

E[X | H] = E[X]. (2.1)

The point of this statement is that if X is independent of H, then the best estimate of X based on the information in H is E[X], the same as the best estimate of X based on no information.
2. (Measurability property): If a random variable X is G-measurable, then

E[X | G] = X. (2.2)

The point of this statement is that if the information content of G is sufficient to determine X, then the best estimate of X based on the information in G is X itself.
3. (Tower property): If H is a sub-σ-algebra of G, then

E[E(X | G) | H] = E[X | H]. (2.3)

H being a sub-σ-algebra of G means that G contains more information than H. If we estimate X based on the information in G, and then estimate that estimator based on the smaller amount of information in H, we get the same result as if we had estimated X directly based on the information in H.
4. (Taking out what is known): If Z is G-measurable, then

E[ZX | G] = Z E[X | G]. (2.4)

The point of this statement is that when conditioning on G, the G-measurable random variable Z acts like a constant, so we can take it out of the expectation.
2.1 Stochastic Processes
Let (Ω, F, P) be a probability space. A collection X = (X_t : 0 ≤ t ≤ T) of random variables on (Ω, F, P) is called a stochastic process. The variable t is usually considered as a time parameter, and X_t denotes the state of a system at time t. For a fixed state of the world ω, the function

t ∈ [0, ∞) ↦ X_t(ω)
is called a sample path. The probability distribution of the process is a probability measure on the function space R^{[0,T]} of all sample paths, and it is determined by the finite-dimensional distributions, i.e. the probabilities

P(X(t_1) ∈ B_1, . . . , X(t_n) ∈ B_n)

for any finite sequence 0 ≤ t_1 ≤ · · · ≤ t_n ≤ T and Borel sets B_1, . . . , B_n. Two stochastic processes X and Y with the same finite-dimensional distributions are identified, and we say that X and Y are versions (or modifications) of one another. Under certain conditions, we can show the existence of a version with sample paths satisfying certain regularity properties. For example, if there exist α > 0 and β > 0 such that for any 0 ≤ u ≤ t ≤ T,

E(|X_t − X_u|^α) ≤ C(t − u)^{1+β}, (2.5)

for some constant C, then there exists a version of X with continuous sample paths. The σ-algebra

F_t = σ(X_u, 0 ≤ u ≤ t), 0 ≤ t ≤ T,

denotes the information available at time t to an observer of the process X. The increasing collection of σ-algebras (F_t)_{0≤t≤T} is called a filtration. A stochastic process Y = (Y_t : 0 ≤ t ≤ T) defined on the same probability space (Ω, F, P) is said to be adapted to the filtration (F_t)_{0≤t≤T} if, for any t, Y_t is F_t-measurable. An (F_t)-adapted process Y is called a martingale if, for any t, Y_t is integrable and, for any 0 ≤ s ≤ t ≤ T,

E(Y_t | F_s) = Y_s.
2.2 The Gaussian Distribution
A random variable U has a standard normal distribution if its probability density is given by

f(u) = (1/√(2π)) e^{−u²/2}.
The characteristic function of a standard normal random variable is given by

φ(t) = E e^{itU} = e^{−t²/2}, t ∈ R.

It follows that

E[U^{2k+1}] = 0 and E[U^{2k}] = (2k)! / (k! 2^k).
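The even-moment formula can be checked against direct numerical integration of u^{2k} f(u) over the standard normal density (the grid is our own illustrative choice):

```python
import math
import numpy as np

# Even moments E[U^(2k)] = (2k)! / (k! 2^k) of a standard normal U,
# compared with a discrete approximation of the integral of u^(2k) f(u).
u = np.linspace(-12.0, 12.0, 240001)      # tails beyond |u| = 12 are negligible
du = u[1] - u[0]
f = np.exp(-0.5 * u**2) / math.sqrt(2.0 * math.pi)

def even_moment(k):
    """Numerical approximation of E[U^(2k)]."""
    return np.sum(u ** (2 * k) * f) * du

formula = [math.factorial(2 * k) // (math.factorial(k) * 2**k) for k in range(1, 4)]
numeric = [even_moment(k) for k in range(1, 4)]   # should be close to 1, 3, 15
```

Both computations give 1, 3 and 15 for k = 1, 2, 3, the familiar normal moments.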
A random variable X has a Gaussian distribution with mean μ and variance σ² if

X = μ + σU,

where U is a standard normal random variable and σ > 0. A standard normal random variable is Gaussian with mean 0 and variance 1. The probability density of X is given by

f(x) = (1/(σ√(2π))) e^{−(x−μ)²/(2σ²)}.

The characteristic function of a Gaussian distribution with mean μ and variance σ² is given by

φ(t) = e^{iμt − σ²t²/2}, t ∈ R.
A random n-vector is a measurable mapping defined on some probability space (Ω, F, P) taking values in a finite-dimensional vector space R^n. A random vector X can be represented as a column vector

X = (X_1, . . . , X_n)^T,

where the components X_i are random variables. If the components X_i have finite first moments, then the mean of the random vector is the column vector

μ = (μ_1, . . . , μ_n)^T = (E[X_1], . . . , E[X_n])^T.

If the second moments of the components are also finite, then the covariance matrix of the random vector is the n × n matrix Σ = (σ_ij)_{1≤i,j≤n} where σ_ij = Cov(X_i, X_j). Note that a covariance matrix is a symmetric matrix, i.e. σ_ij = σ_ji. The covariance matrix of a non-degenerate random vector is positive definite:

⟨Σa, a⟩ = Var(⟨a, X⟩) > 0
for all a ∈ R^n \ {0}. A random n-vector X has a (non-degenerate) n-dimensional Gaussian distribution with mean vector μ and covariance matrix Σ if there exists an n × n matrix A with det(A) ≠ 0 such that

X = μ + AU,

where U is a random n-vector with independent standard normal components. It follows, as in the case of Gaussian random variables, that the matrix A can be considered as a square root of the covariance matrix: Σ = AA^T. The characteristic function of the n-dimensional Gaussian distribution is given by

φ(t) = e^{i⟨t,μ⟩ − ½⟨t,Σt⟩}, t ∈ R^n.

Let B be a k × n matrix; then the characteristic function of the random k-vector BX is

φ_{BX}(t) = e^{i⟨t,Bμ⟩ − ½⟨t,BΣB^T t⟩}, t ∈ R^k,

and we see that BX is a Gaussian k-vector with mean Bμ and covariance matrix BΣB^T. The joint density of a non-degenerate Gaussian n-vector with mean μ and covariance matrix Σ is given by

f(x_1, . . . , x_n) = (2π)^{−n/2} det(Σ)^{−1/2} e^{−½⟨x−μ, Σ^{−1}(x−μ)⟩}.
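The representation X = μ + AU with Σ = AA^T is also how Gaussian vectors are sampled in practice, taking A to be the Cholesky factor of Σ. A seeded sketch with illustrative numbers (our own, not from the notes):

```python
import numpy as np

# Sample X = mu + A U, where A is the Cholesky factor of Sigma and U has
# independent standard normal components; sample moments should approach
# mu and Sigma.
rng = np.random.default_rng(42)
mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])
A = np.linalg.cholesky(Sigma)             # a square root of Sigma: A @ A.T == Sigma

U = rng.standard_normal((2, 200_000))     # independent standard normal components
X = mu[:, None] + A @ U                   # 200 000 samples of the Gaussian vector

sample_mean = X.mean(axis=1)
sample_cov = np.cov(X)
```

With 200 000 samples the empirical mean and covariance match μ and Σ to within a few hundredths.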
2.3 Wiener Process
The concept of a Wiener process is needed in order to model Gaussian disturbances. A stochastic process W = (W(t) : t ≥ 0) defined on some filtered probability space (Ω, F, (F_t)_{t≥0}, P) is a Wiener process if:

1. W is adapted to the filtration (F_t)_{t≥0};
2. W(0) = 0;
3. the process W has independent increments: if r < s ≤ u < t, then W(t) − W(u) and W(s) − W(r) are independent random variables;
4. in particular, if 0 ≤ s < t, then W(t) − W(s) is independent of F_s;
5. for s < t, the increment W(t) − W(s) has the Gaussian distribution N(0, t − s);
6. the trajectories (paths) t ∈ [0, ∞) ↦ W(t, ω) of W are continuous functions.
It follows immediately that the Wiener process is a martingale with respect to the filtration (F_t):

E(W(t) | F_s) = W(s), for 0 ≤ s ≤ t.

For any finite sequence 0 < t_1 < · · · < t_n, the increments W(t_n) − W(t_{n−1}), . . . , W(t_2) − W(t_1), W(t_1) are independent Gaussian random variables, hence

(W(t_1), W(t_2), . . . , W(t_n))^T = L (U_1, U_2, . . . , U_n)^T,

where L is the lower-triangular matrix

L = [ √t_1          0              . . .  0
      √t_1          √(t_2 − t_1)   . . .  0
      . . .         . . .                 . . .
      √t_1          √(t_2 − t_1)   . . .  √(t_n − t_{n−1}) ]

and U_i = (W(t_i) − W(t_{i−1})) / √(t_i − t_{i−1}), i = 1, . . . , n (with t_0 = 0). It follows that the n-vector (W(t_1), . . . , W(t_n))^T is Gaussian with mean 0 and covariance matrix

Σ = [ t_1  t_1  t_1  . . .  t_1
      t_1  t_2  t_2  . . .  t_2
      t_1  t_2  t_3  . . .  t_3
      . . .
      t_1  t_2  t_3  . . .  t_n ],

that is, Σ_ij = min(t_i, t_j).
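The covariance structure Cov(W(t_i), W(t_j)) = min(t_i, t_j) is easy to check empirically: build many Wiener vectors by cumulating independent Gaussian increments and compare the sample covariance with the matrix of minima. A seeded sketch (the grid points are our own choices):

```python
import numpy as np

# Empirical check that Cov(W(t_i), W(t_j)) = min(t_i, t_j).
rng = np.random.default_rng(1)
t = np.array([0.25, 0.5, 1.0, 2.0])
dt = np.diff(np.concatenate(([0.0], t)))               # increment lengths t_i - t_{i-1}

n_paths = 200_000
increments = rng.standard_normal((n_paths, t.size)) * np.sqrt(dt)
W = np.cumsum(increments, axis=1)                      # one row = (W(t_1), ..., W(t_4))

Sigma = np.minimum.outer(t, t)                         # theoretical covariance matrix
sample_cov = np.cov(W, rowvar=False)
```

The sample covariance reproduces Σ_ij = min(t_i, t_j) up to Monte Carlo error.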
The Wiener process is called a Gaussian process, since all its finite-dimensional distributions are Gaussian. Since

E(|W(t) − W(u)|⁴) = 3(t − u)²,

the existence of a continuous version follows from formula (2.5) in Section 2.1 with α = 4 and β = 1. However, almost all sample paths of the Wiener process are nowhere differentiable. We will not give a rigorous proof here, but note that (W(t + h) − W(t))/√h is standard normal for every value of h > 0. So if we consider the ratio (W(t + h) − W(t))/h and let h tend to 0, we see that the variance of this ratio becomes arbitrarily large, so we cannot expect the existence of a limit of the ratio for every ω, which would have to be the case for W_t(ω) to have a time derivative.
We now consider a further property of the sample paths of the Wiener process. Define for any t > 0 the p-variation of the sample path by
V_p(t) = lim_{n→∞} Σ_{i=1}^{2^n} |W(t_i^n) − W(t_{i−1}^n)|^p,
where t_i^n = (i/2^n) t, i = 0, 1, . . . , 2^n, is the dyadic partition of [0, t]. Then

V_p(t) = ∞ if 1 ≤ p < 2,  V_p(t) = t if p = 2,  V_p(t) = 0 if p > 2.
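Before proving this, the two key cases can be seen numerically: on a fine dyadic partition the sum of squared increments is close to t, while the sum of absolute increments is already very large. A seeded sketch with t = 1 and n = 16:

```python
import numpy as np

# Quadratic variation versus 1-variation on the dyadic partition t_i = i t / 2^n.
rng = np.random.default_rng(7)
t, n = 1.0, 16
steps = 2**n
dW = rng.standard_normal(steps) * np.sqrt(t / steps)   # independent Wiener increments

S_n = np.sum(dW**2)          # sum of squared increments: close to t = 1
V1_n = np.sum(np.abs(dW))    # sum of absolute increments: grows like 2^(n/2)
```

With 2^16 increments the quadratic sum is within a few percent of t, while the absolute sum is already in the hundreds and keeps growing as the partition is refined.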
To see this, define

S_n = Σ_{i=1}^{2^n} |W(t_i^n) − W(t_{i−1}^n)|².

Since

E[S_n] = Σ_{i=1}^{2^n} E(|W(t_i^n) − W(t_{i−1}^n)|²) = 2^n · t/2^n = t

and

Var[S_n] = Σ_{i=1}^{2^n} Var(|W(t_i^n) − W(t_{i−1}^n)|²) = 2^n · 2t²/2^{2n} = t²/2^{n−1},

it follows by monotone convergence that

E[Σ_{n=1}^∞ (S_n − t)²] = lim_{N→∞} E[Σ_{n=1}^N (S_n − t)²] = lim_{N→∞} Σ_{n=1}^N Var(S_n) = 2t² < ∞.

Since V_1 = ∞, we cannot define Stieltjes integration with respect to a path of the Wiener process. To understand the problem, we will consider the integral

∫_0^t W(s) dW(s).
Let τ_i^n ∈ [t_{i−1}^n, t_i^n). Define the Riemann sums

R_n = Σ_{i=1}^{2^n} W(τ_i^n)(W(t_i^n) − W(t_{i−1}^n)).

To study the limit behaviour of R_n, we write R_n as sums of terms that are squares of increments or products of increments over disjoint intervals, as follows:

R_n = ½W²(t) − ½W²(0) − ½ Σ_{i=1}^{2^n} (W(t_i^n) − W(t_{i−1}^n))²
  + Σ_{i=1}^{2^n} (W(τ_i^n) − W(t_{i−1}^n))(W(t_i^n) − W(τ_i^n)) + Σ_{i=1}^{2^n} (W(τ_i^n) − W(t_{i−1}^n))².
Since:

1. ½ Σ_{i=1}^{2^n} (W(t_i^n) − W(t_{i−1}^n))² → ½ V_2(t) = ½ t;

2. putting T_n = Σ_{i=1}^{2^n} (W(τ_i^n) − W(t_{i−1}^n))(W(t_i^n) − W(τ_i^n)), we have

E(T_n²) = Σ_{i=1}^{2^n} (τ_i^n − t_{i−1}^n)(t_i^n − τ_i^n) ≤ t²/2^n,

hence E(Σ_n T_n²) < ∞ and T_n → 0 a.s.;

3. putting U_n = Σ_{i=1}^{2^n} (W(τ_i^n) − W(t_{i−1}^n))², we have

E(U_n) = Σ_{i=1}^{2^n} (τ_i^n − t_{i−1}^n) and Var(U_n) = 2 Σ_{i=1}^{2^n} (τ_i^n − t_{i−1}^n)² ≤ 2t²/2^n,

so that E(Σ_n (U_n − E(U_n))²) < ∞ and U_n − E(U_n) → 0 a.s.

If we choose

τ_i^n = (1 − λ) t_{i−1}^n + λ t_i^n, 0 ≤ λ ≤ 1,

it follows that the Riemann sums converge:

lim_{n→∞} R_n = ½ W²(t) + (λ − ½) t.

So the integral ∫_0^t W(s) dW(s) depends on the point at which we evaluate the integrand. The choice λ = 0, i.e. τ_i^n = t_{i−1}^n, leads to the Ito integral. The choice λ = 1/2, i.e. τ_i^n = (t_{i−1}^n + t_i^n)/2, leads to the Stratonovich integral.
2.3.1 Random walk Construction
Here the focus is on simulating the values of a Wiener process (W(t_1), . . . , W(t_n)) at a fixed set of points 0 < t_1 < · · · < t_n.
[Figure 2.1: Sample path of the Wiener process W(t) on t ∈ [0, 1], simulated with time step dt = 0.0025.]
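Sample paths such as the one in Figure 2.1 are drawn by the random-walk recursion W(t_{i+1}) = W(t_i) + √dt · Z_i, with Z_i independent standard normal. A seeded sketch using the same step size as the figure:

```python
import numpy as np

# Random-walk construction of a Wiener sample path on [0, 1] with dt = 0.0025.
rng = np.random.default_rng(2024)
t = np.linspace(0.0, 1.0, 401)       # grid with step dt = 0.0025, as in Figure 2.1
dt = t[1] - t[0]
Z = rng.standard_normal(t.size - 1)  # independent standard normal disturbances

# W(0) = 0 and W(t_{i+1}) = W(t_i) + sqrt(dt) * Z_i, i.e. a cumulative sum.
W = np.concatenate(([0.0], np.cumsum(np.sqrt(dt) * Z)))
```

Plotting W against t reproduces a path of the kind shown in Figure 2.1; each run of the generator gives a different trajectory.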
The sample path of the Wiener process is continuous; that is, its sample paths are almost surely continuous functions of time. For 0 ≤ s ≤ t we have

Var(W(t) − W(s)) = E[(W(t) − W(s))²] − (E[W(t) − W(s)])² = t − s,

so the variance of the increments grows without bound as time increases while the mean always remains zero, which means that many sample paths must attain larger and larger values, both positive and negative, as time increases. However, almost all sample paths of the Wiener process are nowhere differentiable. We will not give a rigorous proof here, but this is easy to see in the mean-square sense:

E[((W(t + h) − W(t))/h)²] = E[(W(t + h) − W(t))²]/h² = h/h² = 1/h,

which blows up as h → 0.
The properties E(W(t)(W(s)) = min(s, t) can be used to demonstrate the independenceof the Wiener increments. Let us assume that the time interval : 0 t0 < . . . < ti1 s satisfies the following PDEs;
∂p/∂t - (1/2) ∂^2 p/∂y^2 = 0,  (s, x) fixed,   (2.7)

∂p/∂s + (1/2) ∂^2 p/∂x^2 = 0,  (t, y) fixed.   (2.8)
The first equation is an example of a heat equation, which describes the variation of temperature as heat passes through a physical medium. The standard Wiener process serves as a prototypical example of a (stochastic) diffusion process. Diffusion processes, which we now define in the one-dimensional case, are a rich and useful class of Markov processes.
Definition 21 Diffusion process
A Markov process with transition densities p(s, x; t, y) is called a diffusion if the followingthree limits exist:
(i) For all ε > 0, s ≥ 0 and x,

lim_{t↓s} 1/(t - s) ∫_{|x-y|>ε} p(s, x; t, y) dy = 0.   (2.9)

Condition (2.9) says that it is very unlikely that the process X(t) undergoes large changes in a short period of time.
(ii) There exist functions μ(s, x) and σ(s, x) such that for all ε > 0, s ∈ [0, T] and x ∈ (-∞, ∞):

(a) lim_{t↓s} 1/(t - s) ∫_{|y-x|≤ε} (y - x) p(s, x; t, y) dy = μ(s, x),

(b) lim_{t↓s} 1/(t - s) ∫_{|y-x|≤ε} (y - x)^2 p(s, x; t, y) dy = σ^2(s, x).
Chapter 3
Stochastic Integrals
We say that a stochastic process X is a diffusion if its local dynamics can be approximated by a stochastic difference equation of the type

X(t + Δt) - X(t) = μ(t, X(t)) Δt + σ(t, X(t)) Z(t).   (3.1)

Examples of processes that can be described by diffusions are asset prices and the position of a moving particle, such as a pollutant in air or water. Here

• Z(t) is a normally distributed disturbance term which is independent of everything that has happened up to time t.

Intuitively, equation (3.1) says that over the interval [t, t + Δt] the process X is driven by two separate terms:

• μ(t, X(t)) is a locally deterministic velocity, called the drift term;

• σ(t, X(t)) is a locally deterministic amplification factor of the Gaussian disturbance Z(t), called the diffusion term.

In equation (3.1) we replace the disturbance term Z(t) by

ΔW(t) = W(t + Δt) - W(t),

where W = (W(t) : t ≥ 0) is the Wiener process, and get

X(t + Δt) - X(t) = μ(t, X(t)) Δt + σ(t, X(t)) ΔW(t).   (3.2)
How should we interpret equation (3.2)?

1. Fix ω, divide by Δt and let Δt tend to 0. We obtain:

   X'(t, ω) = μ(t, X(t, ω)) + σ(t, X(t, ω)) v(t, ω),

   where v(t, ω) is the time derivative of a path of the Wiener process. Such a differential equation could in principle be solved pathwise. But a path of the Wiener process is nowhere differentiable, so this does not work.
2. Fix ω again and let Δt tend to 0 without dividing by Δt:

   dX(t, ω) = μ(t, X(t, ω)) dt + σ(t, X(t, ω)) dW(t, ω).

   Interpret this equation as a shorthand version of the integral equation

   X(t, ω) - X(0, ω) = ∫_0^t μ(s, X(s, ω)) ds + ∫_0^t σ(s, X(s, ω)) dW(s, ω).

   The first integral can be interpreted as a Riemann integral and the second as a Riemann-Stieltjes integral. But a path of the Wiener process is of unbounded variation, so this does not work either.

We have seen that there are problems with interpreting equation (3.2) for each trajectory of the Wiener process separately. Therefore, we will give a global construction for integrals of the form

∫_0^T g(s) dW(s),   (3.3)
for a class of (F_t)-adapted integrands g = (g(t))_{t≥0}, also defined on Ω.

Consider first a simple, non-random integrand:

g(t) = c_0 if t = 0, and c_{i-1} if t_{i-1} < t ≤ t_i, i = 1, ..., n;

equivalently,

g(t) = c_0 1_{{0}}(t) + Σ_{i=1}^n c_{i-1} 1_{(t_{i-1}, t_i]}(t),

where 0 = t_0 < t_1 < ... < t_n = T and c_0, c_1, ..., c_{n-1} ∈ R. Define:

∫_0^T g(s) dW(s) = Σ_{i=1}^n c_{i-1} (W(t_i) - W(t_{i-1})).
Since the increments are independent with mean 0,

Var(∫_0^T g(s) dW(s)) = Var(Σ_{i=1}^n c_{i-1} (W(t_i) - W(t_{i-1})))
                      = Σ_{i=1}^n Var(c_{i-1} (W(t_i) - W(t_{i-1})))
                      = Σ_{i=1}^n c_{i-1}^2 Var(W(t_i) - W(t_{i-1})).

So the outcome of the integral is a random variable defined on Ω with mean 0 and variance

Var(∫_0^T g(s) dW(s)) = Σ_{i=1}^n c_{i-1}^2 (t_i - t_{i-1}).
Example 3 Let g(t) = 2 for 0 ≤ t ≤ 1, g(t) = -2 for 1 < t ≤ 2, and g(t) = 3 for 2 < t ≤ 3 (so that t_i = 0, 1, 2, 3 and c_0 = 2, c_1 = -2, c_2 = 3). Find the mean and variance of the integral ∫_0^3 g(t) dW(t).

Solution 2

∫_0^3 g(t) dW(t) = c_0 (W(1) - W(0)) + c_1 (W(2) - W(1)) + c_2 (W(3) - W(2))   (3.4)
                 = 2 W(1) - 2 (W(2) - W(1)) + 3 (W(3) - W(2)).   (3.5)

The distribution of the integral (3.4) is N[0, 17], since it is the sum of three independent terms with distributions N[0, 4], N[0, 4] and N[0, 9].
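A quick Monte Carlo sketch of this computation (illustrative, assuming numpy; the step values 2, -2, 3 are as read from the example):

```python
import numpy as np

# Monte Carlo check: I = 2 W(1) - 2 (W(2) - W(1)) + 3 (W(3) - W(2)) has
# mean 0 and variance 4 + 4 + 9 = 17, because the three Wiener increments
# over [0,1], (1,2], (2,3] are independent N(0, 1) random variables.
rng = np.random.default_rng(1)
dW = rng.normal(0.0, 1.0, (200_000, 3))  # the increments W(1), W(2)-W(1), W(3)-W(2)
I = 2 * dW[:, 0] - 2 * dW[:, 1] + 3 * dW[:, 2]
mean_I, var_I = I.mean(), I.var()
```

The sample mean and variance come out close to 0 and 17 respectively.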
In order to get a random integrand it is vital to replace the constants c_{i-1} by random variables φ_{i-1}, and to let the random variable φ_{i-1} depend on the values W(t) for t ≤ t_{i-1} but not on future values W(t) for t > t_{i-1}. To get an adapted integrand, we have to assume that if F_t is the σ-field generated by Brownian motion up to time t, then φ_{i-1} is F_{t_{i-1}}-measurable:

g(t) = φ_0 1_{{0}}(t) + Σ_{i=1}^n φ_{i-1} 1_{(t_{i-1}, t_i]}(t).

Define, as before,

∫_0^T g(s) dW(s) = Σ_{i=1}^n φ_{i-1} (W(t_i) - W(t_{i-1})).
By the tower property of conditional expectation, we get

E[∫_0^T g(s) dW(s)] = Σ_{i=1}^n E[E[φ_{i-1} (W(t_i) - W(t_{i-1})) | F_{t_{i-1}}]].

Using the F_{t_{i-1}}-measurability of φ_{i-1} and the independent increments property, we get

E[φ_{i-1} (W(t_i) - W(t_{i-1})) | F_{t_{i-1}}] = φ_{i-1} E[W(t_i) - W(t_{i-1})] = 0,

and it follows that

E[∫_0^T g(s) dW(s)] = 0.
If we also assume that E[φ_{i-1}^2] < ∞, we get

E[(∫_0^T g(s) dW(s))^2] = Σ_{i=1}^n E[φ_{i-1}^2](t_i - t_{i-1}) + 2 Σ_{j<i} E[φ_{j-1} φ_{i-1} (W(t_j) - W(t_{j-1}))(W(t_i) - W(t_{i-1}))].

Conditioning on F_{t_{i-1}} shows that each cross term vanishes, so

E[(∫_0^T g(s) dW(s))^2] = ∫_0^T E[g(s)^2] ds.   (3.6)
To complete the construction of the integral, we introduce the space L^2[0, T] of (equivalence classes of) (F_t)-adapted processes g = (g(t))_{t≥0} satisfying

∫_0^T E[g(s)^2] ds < ∞.

For a general process g ∈ L^2[0, T] which is not simple, we can find a sequence (g_n) of simple processes such that

∫_0^T E[{g_n(s) - g(s)}^2] ds → 0.
Since

{g_n(s) - g_m(s)}^2 ≤ 2{g_n(s) - g(s)}^2 + 2{g(s) - g_m(s)}^2,

it follows that

∫_0^T E[{g_n(s) - g_m(s)}^2] ds → 0.

Now g_n - g_m is simple, so formula (3.6) implies

∫_0^T E[{g_n(s) - g_m(s)}^2] ds = E[(∫_0^T (g_n(s) - g_m(s)) dW(s))^2]
                                = E[(∫_0^T g_n(s) dW(s) - ∫_0^T g_m(s) dW(s))^2].
Define the random variables Z_n ∈ L^2:

Z_n = ∫_0^T g_n(s) dW(s).

It follows that

E[(Z_n - Z_m)^2] → 0.

One can then find a random variable Z ∈ L^2 such that Z_n → Z in L^2:

lim_{n→∞} E[(Z_n - Z)^2] = 0.

We define ∫_0^T g(s) dW(s) = Z. If Z_n → Z in L^2, then lim_{n→∞} E[Z_n] = E[Z] and lim_{n→∞} E[Z_n^2] = E[Z^2]. It follows that

E[∫_0^T g(s) dW(s)] = 0,
and

E[(∫_0^T g(s) dW(s))^2] = ∫_0^T E[g(s)^2] ds.
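This isometry can be illustrated numerically for the adapted integrand g = W itself, for which the right-hand side is ∫_0^T s ds = T^2/2. A Monte Carlo sketch (illustrative, assuming numpy; not from the notes):

```python
import numpy as np

# E[(int_0^T W dW)^2] should equal int_0^T E[W(s)^2] ds = int_0^T s ds = T^2/2.
rng = np.random.default_rng(2)
m, n, T = 20_000, 512, 1.0
dW = rng.normal(0.0, np.sqrt(T / n), (m, n))
W_left = np.hstack([np.zeros((m, 1)), np.cumsum(dW, axis=1)[:, :-1]])  # W(t_{i-1})
I = np.sum(W_left * dW, axis=1)          # Ito sums with left-endpoint evaluation
mean_I, second_moment = I.mean(), np.mean(I**2)
```

With T = 1, the sample mean of I is close to 0 and the sample second moment close to 1/2.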
Remark. By this procedure we have defined the stochastic integral ∫_0^T g(s) dW(s) as a random variable in the space L^2(Ω, F_T, P). It is in general not true that the sequence of random variables ∫_0^T g_n(s) dW(s), n = 1, 2, ..., converges P-a.s.

Note that in the following the stochastic integrals can be evaluated and shown to be

∫_a^b W(t) dW(t) (Ito) = (1/2)(W^2(b) - W^2(a)) - (1/2)(b - a)   (3.7)

and

∫_a^b W(t) dW(t) (Stratonovich) = (1/2)(W^2(b) - W^2(a)).

It is known that stochastic calculus is about systems driven by white noise. Integrals involving white noise may be expressed as

Y(T) = ∫_0^T F(t) dW(t),

an Ito integral when F is random but adapted. The Ito integral, like the Riemann integral, is defined as a certain limit.
The fundamental theorem of calculus allows one to evaluate a Riemann integral without going back to the original definition. Ito's formula plays that role for the Ito integral. Ito's formula has an extra term, not present in the fundamental theorem, that is due to the non-smoothness of Brownian motion paths. Let F_t be the filtration generated by Brownian motion up to time t, and let F(t) be an (F_t)-adapted stochastic process. Corresponding to the Riemann sum approximation of the Riemann integral, we define the following approximation to the Ito integral:

Y_Δt(t) = Σ_{t_k < t} F(t_k) ΔW_k,  where ΔW_k = W(t_{k+1}) - W(t_k).   (3.8)

Since F(t_k) is F_{t_k}-measurable and ΔW_k is independent of F_{t_k},

E[F(t_k) ΔW_k | F_{t_k}] = 0.

That is, each term of the sum (3.8) is a martingale increment, and the approximation (3.8) has mean zero.
Example 4 (Ito integral) Let F(t) be the random function W(t); then

Y(T) = ∫_0^T W(t) dW(t).

Note that if W(t) were differentiable with respect to t, with derivative W'(t), the limit of (3.8) could be calculated using dW(t) = W'(t) dt, and that would wrongly lead to the following equation:

∫_0^T W(t) dW(t) = (1/2) ∫_0^T (d/ds) W^2(s) ds = (1/2) W(T)^2.

When we use the definition (3.8), however, the actual rough path of Brownian motion gives a different expression.
The steps are as follows. Write the Brownian motion evaluated at t_k as

W(t_k) = (1/2)[W(t_{k+1}) + W(t_k)] - (1/2)[W(t_{k+1}) - W(t_k)],

and put this into the Ito sum (3.8) to get

Y_Δt(t_n) = (1/2) Σ_k [W^2(t_{k+1}) - W^2(t_k)] - (1/2) Σ_k [W(t_{k+1}) - W(t_k)]^2
          = (1/2) W^2(t_n) - (1/2) Σ_k [W(t_{k+1}) - W(t_k)]^2,

since the first sum telescopes. Each term (1/2)(W(t_{k+1}) - W(t_k))^2 has mean Δt/2
and variance

Var[(1/2)(W(t_{k+1}) - W(t_k))^2] = (1/4) Var[(W(t_{k+1}) - W(t_k))^2]
                                  = (1/4)(3(Δt)^2 - (Δt)^2) = (Δt)^2/2.

As a consequence, the sum (1/2) Σ_k (W(t_{k+1}) - W(t_k))^2 is a random variable with mean

n Δt/2 = t_n/2

and variance

n (Δt)^2/2 = t_n Δt/2.

Since this variance tends to 0 as Δt → 0, it follows that

(1/2) Σ_{t_k < t} (W(t_{k+1}) - W(t_k))^2 → t/2 in mean square,

and hence

Y(t) = ∫_0^t W(s) dW(s) = (1/2) W^2(t) - (1/2) t.
Chapter 4
Ito Integral Process
Let g be a very simple process:

g(s, ω) = 1_A(ω) 1_{(u,v]}(s),

with A ∈ F_u. Define for t ≥ 0:

I(t) = ∫_0^t g(s) dW(s) =
  0                     if t ≤ u,
  1_A (W(t) - W(u))     if u < t ≤ v,
  1_A (W(v) - W(u))     if t > v.
The process (I(t))_{t≥0} is an (F_t)-martingale. In general we have the following result: for any process g ∈ L^2, the process (X(t)) defined by

X(t) = ∫_0^t g(s) dW(s)

is an (F_t)-martingale.

Consider, for a process g ∈ L^2, the process (X(t)) defined by

X(t) = ∫_0^t g(s) dW(s).
Let [X, X](t) be the quadratic variation of X over [0, t]:

[X, X](t) = lim_{n→∞} Σ_i |X(t_i^n) - X(t_{i-1}^n)|^2.

In Section 2.3 we derived the quadratic variation of W:

[W, W](t) = t.

Consider the example

g(t) = σ_0 1_{[0, 1/2]}(t) + σ_1 1_{(1/2, 1]}(t).
Then

X(t) = ∫_0^t g(s) dW(s) =
  σ_0 W(t)                             if t ≤ 1/2,
  σ_0 W(1/2) + σ_1 (W(t) - W(1/2))     if t > 1/2.

Then, for example for t ≤ 1/2, we get

[X, X](t) = lim_{n→∞} Σ_i σ_0^2 (W(t_i^n ∧ t) - W(t_{i-1}^n ∧ t))^2 = σ_0^2 [W, W](t) = σ_0^2 t.
In general, we have for g ∈ L^2:

[X, X](t) = ∫_0^t g^2(s) ds.

Written in differential form,

(dX(t))^2 = g^2(t) dt.
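This identity can be checked numerically for a piecewise-constant integrand, say g = s0 on [0, 1/2] and g = s1 on (1/2, 1]. A Monte Carlo sketch (illustrative, assuming numpy; the values s0, s1 are arbitrary):

```python
import numpy as np

# Check [X, X](1) = int_0^1 g(s)^2 ds for g = s0 on [0, 1/2], s1 on (1/2, 1].
s0, s1 = 2.0, 0.5
rng = np.random.default_rng(7)
n = 400_000
dt = 1.0 / n
g = np.where(np.arange(n) * dt < 0.5, s0, s1)
dX = g * rng.normal(0.0, np.sqrt(dt), n)  # increments of X(t) = int g dW
qv = np.sum(dX**2)                        # quadratic variation over [0, 1]
expected = 0.5 * s0**2 + 0.5 * s1**2      # int_0^1 g(s)^2 ds
```

For a fine grid, the realised quadratic variation qv is close to the deterministic value ∫_0^1 g^2(s) ds.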
Let X = (X(t))_{t≥0} be a stochastic process and assume that there exist a real number x_0 and adapted processes μ = (μ(t)) and σ = (σ(t)) with

∫_0^T |μ(t)| dt < ∞ a.s. and σ ∈ L^2,

such that for all t ≥ 0

X(t) = x_0 + ∫_0^t μ(s) ds + ∫_0^t σ(s) dW(s).   (4.1)
Such a process X is also called an Ito process. The processes μ and σ in the representation (4.1) are unique a.s. To see this, suppose X has two representations:

X(t) = x_0 + ∫_0^t μ(s) ds + ∫_0^t σ(s) dW(s) = x_0* + ∫_0^t μ*(s) ds + ∫_0^t σ*(s) dW(s).

Then x_0 = x_0* and

∫_0^t (μ(s) - μ*(s)) ds = ∫_0^t (σ*(s) - σ(s)) dW(s).
Let M(t) = ∫_0^t (μ(s) - μ*(s)) ds. It follows that M is a martingale with finite variation, since

Σ_i |M(t_i^n) - M(t_{i-1}^n)| ≤ ∫_0^T |μ(s)| ds + ∫_0^T |μ*(s)| ds < ∞.

So the quadratic variation of M is 0; see Section 2.3. Note that for 0 ≤ s < t

E[(M(t) - M(s))^2] = E[M^2(t)] - 2 E[M(s) M(t)] + E[M^2(s)]
                   = E[M^2(t)] - 2 E[M(s) E(M(t) | F_s)] + E[M^2(s)]
                   = E[M^2(t)] - E[M^2(s)].
So, using this and monotone convergence,

0 = E[lim_{n→∞} Σ_i |M(t_i^n) - M(t_{i-1}^n)|^2] = lim_{n→∞} Σ_i E(|M(t_i^n) - M(t_{i-1}^n)|^2)
  = lim_{n→∞} Σ_i {E[M^2(t_i^n)] - E[M^2(t_{i-1}^n)]}
  = E[M^2(T)].

It follows that M(T) = 0 a.s. and M(t) = E(M(T) | F_t) = 0 a.s. for all t. So μ = μ* a.s., and it follows that ∫_0^t (σ(s) - σ*(s)) dW(s) = 0 for all t. Hence

0 = E[(∫_0^t (σ(s) - σ*(s)) dW(s))^2] = ∫_0^t E[(σ(s) - σ*(s))^2] ds,

and this implies σ = σ* a.s.

Let X be an Ito process with representation
X(t) = x_0 + ∫_0^t μ(s, X_s) ds + ∫_0^t σ(s, X_s) dW(s).

Usually we write this equation in differential form:

dX(t) = μ(t, X_t) dt + σ(t, X_t) dW(t),  X(0) = x_0,   (4.2)

or, in the case of coefficients not depending on the state,

X(t) = x_0 + ∫_0^t μ(s) ds + ∫_0^t σ(s) dW(s),

with differential form

dX(t) = μ(t) dt + σ(t) dW(t),  X(0) = x_0.

Note that a stochastic process X having a stochastic differential is a martingale if and only if the stochastic differential has the form

dX(t) = g(t) dW(t),

i.e. X has no dt term. The quantity μ(t, x) is called the drift of the diffusion process and σ(t, x) its diffusion coefficient at time t and position x; in eqn. (4.2) this implies that
μ(s, x) = lim_{t↓s} 1/(t - s) E[X(t) - X(s) | X(s) = x],   (4.3)
so the drift μ(s, x) is the instantaneous rate of change of the mean of the process given that X(s) = x. Similarly,

σ^2(s, x) = lim_{t↓s} 1/(t - s) E[(X(t) - X(s))^2 | X(s) = x],   (4.4)

so the squared diffusion coefficient gives the instantaneous rate of change of the squared fluctuations of the process given that X(s) = x. When the drift and the diffusion coefficient of a diffusion process are sufficiently smooth functions, the transition density p(s, x; t, y) also satisfies the following partial differential equations:
∂p/∂t + ∂/∂y {μ(t, y) p} - (1/2) ∂^2/∂y^2 {σ^2(t, y) p} = 0,  (s, x) fixed,   (4.5)

∂p/∂s + μ(s, x) ∂p/∂x + (1/2) σ^2(s, x) ∂^2 p/∂x^2 = 0,  (t, y) fixed,   (4.6)
with the former equation (4.5) describing the forward evolution with respect to the final state (t, y) and the latter equation (4.6) the backward evolution with respect to the initial state (s, x). The forward equation (4.5) is commonly called the Fokker-Planck equation, especially by physicists and engineers.
4.1 Motivation and problem formulation
Many physical problems and time-varying behaviours are described by deterministic ordinary differential equations (ODEs). For instance, when the state of the physical system is denoted by x(t), we obtain the ordinary differential equation

dx/dt = f(t, x),  x(t_0) = x_0,   (4.7)

as a degenerate form, in the absence of uncertainties, of the stochastic differential equation which is about to be defined. A detailed and thorough introduction to the theory of stochastic differential equations can be found in Øksendal (2003) [15]. The differential equation (4.7) can be written as

dx(t) = f(t, x) dt,

and in integral form as

x(t) = x_0 + ∫_{t_0}^t f(s, x(s)) ds.
It follows that x(t) = x(t | x_0, t_0) is a solution with initial condition x(t_0) = x_0. Nevertheless, when there is uncertainty, physical system behaviour often can only be described in terms of probability and must be modelled by means of a stochastic model. Therefore in this chapter we discuss the stochastic differential equation as a model for a stochastic process X(t). Roughly, we can think of a stochastic differential equation (SDE) as an ordinary differential equation (ODE) with an added random perturbation in the dynamics. If {X(t), t ≥ 0} is a real-valued process describing the state of a system at each time t, the stochastic differential equation (SDE) governing the time evolution of this process X is given by

dX(t)/dt = f(t, X(t)) + g(t, X(t)) ξ(t),  X(t_0) = x_0.   (4.8)
The white noise ξ(t) is a stochastic process. It is introduced so as to model uncertainties in the underlying deterministic differential equation. Generally, ξ(t) is understood in the engineering literature as a stationary Gaussian process on -∞ < t < ∞. The initial condition is also assumed to be a random variable, independent of ξ(t). Essentially, equation (4.8) should be Markov. This implies that the future behaviour of the state of the system depends only on the present state, if it is known, and not on its past (Arnold 1974, Øksendal 2003). SDEs such as (4.8) arise when a variety of random dynamic phenomena in the physical, biological, engineering and social sciences are modelled. Solutions of these equations are often diffusion processes and hence are connected to the subject of partial differential equations (PDEs). For instance, let us now consider the model for Biochemical Oxygen Demand (BOD) in stream bodies as described by the following equation:
dB/dt = -K_1 B + s_1,  B(t_0) = B_0,

where the deterministic process B(t) is the BOD (mg/l), K_1 is the reaction rate coefficient (1/day) and s_1 is the source or sink along the stream. Let us suppose that there are uncertainties associated with the source input s_1. These can be modelled by adding a white noise process σξ_t with intensity σ to s_1. The resulting stochastic model for the stochastic process B_t now becomes:

dB_t/dt = -K_1 B_t + s_1 + σξ_t,  B_{t_0} = B_0.

Another source of uncertainty can be the parameter K_1. Adding a white noise process to this parameter results in:

dB_t/dt = -(K_1 + σξ_t) B_t + s_1,  B_{t_0} = B_0.

Note that both stochastic models are of the general type (4.8).

An essential property of the stochastic model (4.8) is that it should be Markov. This property implies that information on the probability density of the state X_t at time t is sufficient for computing model predictions for times later than t. If the model were not Markovian, information on the system state at times before t would also be required, which would make the model very impractical. As we will show in this chapter, the stochastic differential equation
(4.8) is Markovian if ξ_t is a continuous-time Gaussian white noise process. We assume that the stochastic process ξ(t) is Gaussian, i.e. for all t_0 ≤ t_1 ≤ ... ≤ t_n ≤ T the random vector Z = (ξ(t_1), ..., ξ(t_n)) ∈ R^n has a normal distribution. Furthermore, we assume that ξ(t) is a white noise process, i.e. ξ(t) satisfies the following conditions:

E(ξ(t)) = 0,

E(ξ(t_1) ξ(t_2)) = δ(t_2 - t_1),

where δ(t) is the Dirac delta function. The name white noise comes from the fact that such a process has a spectrum in which all frequencies participate with the same intensity, which is characteristic of white light.
As in [?], for example, let us consider a 1-dimensional white noise process. White noise has a constant spectral density f(λ) on the entire real axis. If E[ξ(s) ξ(t + s)] = C(t) is the covariance function of ξ(t), then the spectral density is given by:

f(λ) = (1/2π) ∫_{-∞}^{∞} e^{-iλt} C(t) dt = c/2π.   (4.9)

Without loss of generality the positive constant c can be taken equal to 1. White noise ξ(t) can be approximated by an ordinary stationary Gaussian process X(t), for example one with covariance

C(t) = a e^{-b|t|}  (a > 0, b > 0);

it can be shown that such a process has spectral density

f(λ) = ab/(π(b^2 + λ^2)).
f(λ) = (1/2π) ∫_{-∞}^{∞} e^{-iλt} C(t) dt
     = (a/2π) ∫_{-∞}^{∞} e^{-b|t|} e^{-iλt} dt
     = (a/2π) ∫_{-∞}^{∞} e^{-b|t|} [cos(λt) - i sin(λt)] dt
     = (a/2π) ∫_{-∞}^{∞} e^{-b|t|} cos(λt) dt   (even integrand)
       - (a/2π) i ∫_{-∞}^{∞} e^{-b|t|} sin(λt) dt   (odd integrand, vanishes)
     = (2a/2π) ∫_0^{∞} e^{-bt} cos(λt) dt
     = (a/π) ∫_0^{∞} e^{-bt} cos(λt) dt
     = (a/π) [e^{-bt} (-b cos(λt) + λ sin(λt))/(b^2 + λ^2)]_0^{∞}
     = (a/π) (0 + b/(b^2 + λ^2)),

so that

f(λ) = ab/(π(b^2 + λ^2)).   (4.10)
If we now let a and b approach ∞ in such a way that a/b → 1/2, we get

f(λ) → 1/2π,  C(t) → { 0 for t ≠ 0, ∞ for t = 0 },  ∫_{-∞}^{∞} C(t) dt = 2a/b → 1,

so that C(t) → δ(t); that is, X(t) converges in a certain sense to ξ(t) [9].
Figure 4.1: (a) A white noise process: it is discontinuous at every point and cannot be integrated in the sense of Lebesgue or Riemann integrals. (b) The Brownian motion (or Wiener process), which can be considered as a formal integral of the white noise process.
The following MATLAB program produces the white noise and Wiener tracks shown in Figure 4.1.

clear
t=0:0.002:1; l=size(t); white_noise=randn(l(1),l(2)); white_noise(1)=0;
figure(1); plot(t,white_noise);
set(gca,'FontName','Times New Roman','FontSize',16);
x=xlabel('time'); set(x,'FontName','Times New Roman','FontSize',16);
y=ylabel('white noise');
set(y,'FontName','Times New Roman','FontSize',16);
figure(2); dt=t(2)-t(1); wiener_process=zeros(l(1),l(2)); wiener_process(1)=0;
for k=2:l(2)
    wiener_process(k)=wiener_process(k-1)+sqrt(dt)*white_noise(k);
end
plot(t,wiener_process);
set(gca,'FontName','Times New Roman','FontSize',16,...
    'YLim',[-1.5 1.5]);
x=xlabel('time');
set(x,'FontName','Times New Roman','FontSize',16);
y=ylabel('Wiener motion');
set(y,'FontName','Times New Roman','FontSize',16);
Therefore the formal integration of equation (4.8) leads us to the equation

X(t) = X(t_0) + ∫_{t_0}^t f(s, X(s)) ds + ∫_{t_0}^t g(s, X(s)) ξ(s) ds.   (4.11)

However, it is impossible to evaluate the last integral in (4.11) using only the standard mathematical instruments known from real analysis [9]. The new mathematical theory that allows one to solve equation (4.11) was developed in the middle of the last century by K. Ito and R.L. Stratonovich. Formally, the white noise process ξ(t) is considered as the derivative of Brownian motion W(t) (see, for instance, Jazwinski 1970 [9]):

dW(t) = ξ(t) dt.   (4.12)

Equation (4.11) can then be written in the form

dX(t) = f(t, X(t)) dt + g(t, X(t)) dW(t),   (4.13)

or in the integral form

X(t) = X(t_0) + ∫_{t_0}^t f(s, X(s)) ds + ∫_{t_0}^t g(s, X(s)) dW(s).   (4.14)

Equation (4.13) is called a stochastic differential equation.
4.2 Stochastic Differential Equations
Let M(n, d) denote the class of n × d matrices. Consider as given:

• a d-dimensional Wiener process W;
• a function μ : R_+ × R^n → R^n;
• a function σ : R_+ × R^n → M(n, d);
• a vector x_0 ∈ R^n.

Consider the stochastic differential equation (SDE)

dX(t) = μ(t, X(t)) dt + σ(t, X(t)) dW(t).   (4.15)

The process X = (X(t)) is called a strong solution of the SDE (4.15) if for all t > 0, X(t) is a function F(t, (W(s), s ≤ t)) of the given Wiener process W, the integrals ∫_0^t μ(s, X(s)) ds and ∫_0^t σ(s, X(s)) dW(s) exist, and the integral equation

X(t) = X(0) + ∫_0^t μ(s, X(s)) ds + ∫_0^t σ(s, X(s)) dW(s)

is satisfied. If the coefficients μ and σ satisfy the following conditions:
1. a Lipschitz condition in x and y: there exists K such that for all x ∈ R^n, y ∈ R^n, t ≥ 0,

   ||μ(t, x) - μ(t, y)|| + ||σ(t, x) - σ(t, y)|| ≤ K ||x - y||;

2. a linear growth condition: there exists K such that for all x ∈ R^n, t ≥ 0,

   ||μ(t, x)|| + ||σ(t, x)|| ≤ K (1 + ||x||);

3. x_0 is a constant;

then there exists a unique strong solution X of the stochastic differential equation (4.15) with continuous trajectories, and there exists a constant C such that

E[||X_t||^2] ≤ C e^{Ct} (1 + ||x_0||^2).
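Under such conditions the strong solution can also be approximated numerically. A minimal Euler-Maruyama sketch for a scalar SDE (illustrative, assuming numpy; this scheme is not part of the notes at this point):

```python
import numpy as np

def euler_maruyama(mu, sigma, x0, T, n, seed=0):
    """Euler-Maruyama approximation X_{k+1} = X_k + mu(t_k, X_k) dt
    + sigma(t_k, X_k) dW_k for the scalar SDE dX = mu dt + sigma dW."""
    rng = np.random.default_rng(seed)
    dt = T / n
    t, X = 0.0, float(x0)
    path = [X]
    for _ in range(n):
        X = X + mu(t, X) * dt + sigma(t, X) * rng.normal(0.0, np.sqrt(dt))
        t += dt
        path.append(X)
    return np.array(path)
```

For example, euler_maruyama(lambda t, x: -x, lambda t, x: 1.0, 0.0, 1.0, 1000) approximates the Ornstein-Uhlenbeck equation dX = -X dt + dW; with sigma ≡ 0 the scheme reduces to the explicit Euler method for the ODE (4.7).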
4.3 Linear Stochastic Differential Equations
The Ito stochastic integral provides us with the means for formulating stochastic differential equations. Such equations describe the dynamics of many important continuous-time stochastic systems.

General linear stochastic differential equations

In general, linear SDEs are written in the following form:

dX_t = (a_1(t) X_t + a_2(t)) dt + (b_1(t) X_t + b_2(t)) dW_t.   (4.16)

• The linear SDE is autonomous if all coefficients are constants.
• The linear SDE is homogeneous if a_2(t) = 0 and b_2(t) = 0.
• The SDE is linear in the additive sense if b_1(t) = 0. This implies that the Ito integral looks like ∫_0^t g dW_s.
• The SDE is linear in the multiplicative sense if b_2(t) = 0. This implies that the Ito integral looks like ∫_0^t X_s g dW_s.

The general solution to a linear SDE with additive noise,

dX_t = (a_1(t) X_t + a_2(t)) dt + b_2(t) dW_t,

is well detailed in the following books, for example [13, 12].
Types of solutions to stochastic differential equations

• Under some regularity conditions on the coefficients, the solution to the SDE (4.16) is a diffusion process.

• The solution is a strong solution if it is valid for each given Wiener process (and initial value), that is, it is sample-pathwise unique.

• A strong solution is an adapted function X(W(t), t), where the Brownian motion path W(t) again plays the major role of the abstract random variable ω; X(t), that is X(W(t), t), is measurable in F_t, implying that X(t) is a function of the values W(s) for 0 ≤ s ≤ t. For example, X(t) = e^{(a_1 - b_1^2/2)t + b_1 W(t)} is a strong solution (geometric Brownian motion) of equation (4.16) when a_1(t) = a_1 and b_1(t) = b_1 are constants and a_2(t) = b_2(t) = 0. Note that it depends only on W(t), while X(t) = ∫_0^t e^{-(t-s)} dW(s) is a strong solution of the Ornstein-Uhlenbeck equation dX(t) = -X(t) dt + dW(t) with X(0) = 0; this solution depends on the whole path up to time t.

• A diffusion process whose transition density satisfies the Fokker-Planck equation is a solution of an SDE.

• A solution is a weak solution if it is valid for the given coefficients but an unspecified Wiener process; that is, its probability law is unique. In other words, a weak solution is a stochastic process X(t), defined perhaps on a different probability space and filtration, that reproduces the statistical properties of the SDE

dX(t) = μ(X(t), t) dt + σ(X(t), t) dW(t),

where, roughly speaking, the solution satisfies

E[(X(t + Δt) - X(t)) | F_t] = μ(X(t), t) Δt + o(Δt),   (4.17)

E[(X(t + Δt) - X(t))^2 | F_t] = σ^2(X(t), t) Δt + o(Δt).   (4.18)

Thus a strong solution is also a weak solution, but not the other way around, since for a weak solution we have no information on how, or even whether, the solution depends on W(t).
Exercise 2 Consider the following SDE:

dX = λX dt + σ dW(t),  X(0) = x_0,   (4.19)

with λ and σ > 0 constants. The solution X_t of equation (4.19) is a diffusion process that may be used to describe a range of problems, depending on the interpretation of the variables incorporated in the model. For instance, it may represent the location of a particle, initially released at a given point, as a function of time, or it may describe the concentration distribution of some colorant released as an infinitesimal droplet at the starting point. The initial distribution of the variable X is given by a delta peak located at x_0. For negative values of λ, the process is of the Ornstein-Uhlenbeck type and will converge to a stable distribution. When λ equals zero, the model reduces to a scaled version of the Wiener process. The process X_t is Markov; so, in order to determine the value of x_{t+1}, an instance of the process X at time t + 1, we only need to know x_t. Since the process is Gaussian, that is X(t) ~ N(e^{λt} x_0, σ^2(t)), generally we are interested in the mean and variance of the process. Therefore:

(a) Show that the expectation of the process X_t is

E{X(t)} = e^{λt} x_0,   (4.20)

and

(b) the variance of X_t is

σ^2(t) = var{X(t)} = σ^2 (e^{2λt} - 1)/(2λ).   (4.21)
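The claimed mean and variance can be checked by simulation. A sketch (illustrative, assuming numpy; the parameter values λ = -1, σ = 0.5, x_0 = 2 are hypothetical, not from the exercise):

```python
import numpy as np

# Monte Carlo check of (4.20)-(4.21) for dX = lam*X dt + sig*dW(t), X(0) = x0.
lam, sig, x0, t = -1.0, 0.5, 2.0, 1.5
rng = np.random.default_rng(3)
m, n = 50_000, 600
dt = t / n
X = np.full(m, x0)
for _ in range(n):                      # Euler-Maruyama over m paths at once
    X = X + lam * X * dt + sig * rng.normal(0.0, np.sqrt(dt), m)

mean_exact = np.exp(lam * t) * x0                            # (4.20)
var_exact = sig**2 * (np.exp(2 * lam * t) - 1) / (2 * lam)   # (4.21)
```

The sample mean and variance of X agree with (4.20) and (4.21) up to Monte Carlo and discretisation error.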
Chapter 5
Ito's Formula
Let X be an Ito process with stochastic differential

dX(t) = μ(t) dt + σ(t) dW(t).

Assume further that we are given a C^{1,2} function f : R_+ × R → R. Define a new process Z by

Z(t) = f(t, X(t)).

Then Z has a stochastic differential given by

df(t, X(t)) = (∂f/∂t) dt + (∂f/∂x) dX(t) + (1/2)(∂^2 f/∂x^2) [dX(t)]^2
            = (∂f/∂t + μ ∂f/∂x + (1/2) σ^2 ∂^2 f/∂x^2) dt + σ (∂f/∂x) dW(t),   (5.1)

where the term μ ∂f/∂x is shorthand for

μ(t) (∂f/∂x)(t, X(t)),

and so on. Note that formally

[dX(t)]^2 = [μ dt + σ dW(t)]^2 = μ^2 [dt]^2 + 2μσ [dt][dW(t)] + σ^2 [dW(t)]^2 = σ^2 dt,

where we used the following multiplication table:

          dt    dW(t)
dt        0     0
dW(t)     0     dt

In the special case where the function f : R → R is twice differentiable, we get:

df(X(t)) = (μ f'(X(t)) + (1/2) σ^2 f''(X(t))) dt + σ f'(X(t)) dW(t).

To check that this is really the case, consider the following example:
Example 5 As an example, we use Ito's formula to calculate E[e^{αW(t)}]. By Ito's formula,

d e^{αW(t)} = (1/2) α^2 e^{αW(t)} dt + α e^{αW(t)} dW(t),

or in integrated form

e^{αW(t)} = 1 + (1/2) α^2 ∫_0^t e^{αW(s)} ds + α ∫_0^t e^{αW(s)} dW(s).

Taking expected values makes the stochastic integral vanish:

E[e^{αW(t)}] = 1 + (1/2) α^2 ∫_0^t E[e^{αW(s)}] ds.

Define m(t) = E[e^{αW(t)}]; then

m(t) = 1 + (1/2) α^2 ∫_0^t m(s) ds.

Taking the derivative with respect to t,

m'(t) = (1/2) α^2 m(t),  m(0) = 1.

Solving this equation we get

E[e^{αW(t)}] = e^{α^2 t/2}.
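This result is easy to confirm by direct sampling, since W(t) ~ N(0, t). A Monte Carlo sketch (illustrative, assuming numpy; the values of α and t are arbitrary):

```python
import numpy as np

# Monte Carlo check of Example 5: E[exp(alpha W(t))] = exp(alpha^2 t / 2).
alpha, t = 0.8, 2.0
rng = np.random.default_rng(4)
W_t = rng.normal(0.0, np.sqrt(t), 1_000_000)
estimate = np.mean(np.exp(alpha * W_t))
exact = np.exp(alpha**2 * t / 2)
```

The sample average agrees with e^{α^2 t/2} up to Monte Carlo error.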
Exercise 3 Evaluate the following:

(a) E[e^{α(W_t - W_s)}],

(b) E[e^{αt + (1/2) W_t}].
Example 6 Let us now evaluate the integral

I = ∫_0^t W_s dW_s.

Choose X_t = W_t and g(t, x) = (1/2) x^2. It follows that

Y_t = g(t, X_t) = (1/2) W_t^2.
Then by the Ito formula (5.1),

dY_t = (∂g/∂t) dt + (∂g/∂x) dW_t + (1/2)(∂^2 g/∂x^2)(dW_t)^2
     = W_t dW_t + (1/2) dt.

Hence

d((1/2) W_t^2) = W_t dW_t + (1/2) dt.

In other words,

(1/2) W_t^2 = ∫_0^t W_s dW_s + (1/2) t,

which means

∫_0^t W_s dW_s = (1/2) W_t^2 - (1/2) t.
Example 7 Let us now consider the population growth model as explained in chapter five of the book [15], where

dN_t/dt = a_t N_t,  N_0 given,   (5.2)

in which we choose a_t = r_t + αξ_t, with ξ_t white noise; i.e., we include uncertainty in the model. Let us assume that r_t = r is constant. In the Ito interpretation, equation (5.2) is equivalent to

dN_t = r N_t dt + α N_t dW_t,   (5.3)

or equivalently

dN_t/N_t = r dt + α dW_t.   (5.4)

It follows that

∫_0^t dN_s/N_s = rt + αW_t,  where W_0 = 0.

One can see that the evaluation of the integral on the left-hand side requires the use of the Ito formula for the function

g(t, x) = ln x,  x > 0.
In this case we get

d(ln N_t) = (1/N_t) dN_t - (1/2)(1/N_t^2)(dN_t)^2
          = dN_t/N_t - (1/2)(α^2 N_t^2/N_t^2) dt
          = dN_t/N_t - (1/2) α^2 dt.

Therefore

dN_t/N_t = d(ln N_t) + (1/2) α^2 dt.
Now if we compare this equation with equation (5.4) we find that

d(ln N_t) + (1/2) α^2 dt = r dt + α dW_t.

It follows that

d(ln N_t) = (r - (1/2) α^2) dt + α dW_t,

∫_0^t d(ln N_s) = ∫_0^t (r - (1/2) α^2) ds + ∫_0^t α dW_s,

ln(N_t/N_0) = (r - (1/2) α^2) t + αW_t.

Hence

N_t = N_0 e^{(r - α^2/2) t + αW_t}.   (5.5)
For the Stratonovich interpretation, we have

dN̄_t = r N̄_t dt + α N̄_t ∘ dW_t,

dN̄_t/N̄_t = r dt + α ∘ dW_t,   (5.6)

∫_0^t dN̄_s/N̄_s = ∫_0^t r ds + ∫_0^t α ∘ dW_s,   (5.7)

and direct integration of (5.7) gives the Stratonovich solution N̄_t:

N̄_t = N_0 e^{rt + αW_t}.   (5.8)
The solutions N_t and N̄_t are both processes of the type

X_t = X_0 e^{μt + αW_t}.

Such processes are called geometric Brownian motions. They are important models for stochastic prices in economics.

Note that it seems reasonable that if W_t is independent of N_0 we should have

E[N_t] = E[N_0] e^{rt},

that is, the same as when there is no noise in a_t in equation (5.2). As anticipated, for the Ito solution we indeed obtain

E[N_t] = E[N_0] e^{rt},   (5.9)

but for the Stratonovich solution the same calculation gives

E[N̄_t] = E[N_0] e^{(r + α^2/2) t}.   (5.10)

The explicit solutions N_t and N̄_t in (5.5) and (5.8) respectively can be analysed using our knowledge of the behaviour of W_t. For example, if we consider the Ito solution N_t, i.e. equation (5.5), we see that

(a) if r > (1/2) α^2 then N_t → ∞ a.s.;

(b) if r < (1/2) α^2 then N_t → 0 a.s.;

while the Stratonovich solution N̄_t → ∞ a.s. already when r > 0. Thus the two solutions have fundamentally different properties, and it is an interesting question which solution gives the best description of the process/situation.
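The difference between the two means (5.9) and (5.10) is easy to see numerically, since both explicit solutions are functions of the single Gaussian variable W_t ~ N(0, t). A Monte Carlo sketch (illustrative, assuming numpy; r, α, t, N_0 are arbitrary values):

```python
import numpy as np

# The Ito solution N_t = N0 exp((r - a^2/2)t + a W_t) has mean N0 e^{rt},
# while the Stratonovich solution N0 exp(rt + a W_t) has mean N0 e^{(r + a^2/2)t}.
r, a, t, N0 = 0.5, 0.3, 1.0, 1.0
rng = np.random.default_rng(6)
W_t = rng.normal(0.0, np.sqrt(t), 1_000_000)
mean_ito = np.mean(N0 * np.exp((r - 0.5 * a**2) * t + a * W_t))
mean_str = np.mean(N0 * np.exp(r * t + a * W_t))
```

The two sample means reproduce (5.9) and (5.10) respectively; the Stratonovich mean is larger by the factor e^{α^2 t/2}.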
Remark 1 There is a fundamental Ito formula (see Arnold (1974)) in stochastic calculus. For example, the differential form of the Ito formula applied to the function \(W^n(t)\), for an integer \(n \ge 1\) and \(t \ge 0\), gives

\[
d(W_t^n) = n W_t^{n-1}\,dW_t + \frac{n(n-1)}{2}\,W_t^{n-2}\,dt.
\]

Exercise 4 Use the Ito formula to prove that

\[
\int_0^t W_s^2\,dW_s = \frac{1}{3}W_t^3 - \int_0^t W_s\,ds.
\]

Hint: choose \(n = 3\) and apply the above formula.
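The identity in Exercise 4 can also be checked numerically on a single simulated Brownian path, approximating the Ito integral by a left-point sum. A sketch (step count and seed are illustrative choices):

```python
import numpy as np

# Pathwise numerical check of Exercise 4:
#     int_0^t W_s^2 dW_s = (1/3) W_t^3 - int_0^t W_s ds,
# approximating the Ito integral by a left-point Riemann sum.
rng = np.random.default_rng(1)
n, t = 200_000, 1.0
dt = t / n
dW = np.sqrt(dt) * rng.standard_normal(n)
W = np.concatenate(([0.0], np.cumsum(dW)))     # W_0, W_dt, ..., W_t

ito_sum = np.sum(W[:-1] ** 2 * dW)             # left-point sum for int W^2 dW
riemann = np.sum(W[:-1]) * dt                  # ordinary integral int W ds
rhs = W[-1] ** 3 / 3.0 - riemann
err = abs(ito_sum - rhs)                       # tends to 0 as dt -> 0
```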
5.1 The Multidimensional Ito Formula
When we encounter a higher-dimensional situation we consider a vector of stochastic processes

\[
X = \begin{pmatrix} X_1 \\ \vdots \\ X_n \end{pmatrix},
\]

where each component \(X_i\) has a stochastic differential

\[
dX_i(t) = \mu_i(t)\,dt + \sum_{j=1}^{d} \sigma_{ij}(t)\,dW_j(t),
\]

with \(W_1, \ldots, W_d\) independent Wiener processes. Define the drift term \(\mu\) and the \(d\)-dimensional Wiener process \(W\) by

\[
\mu = \begin{pmatrix} \mu_1 \\ \vdots \\ \mu_n \end{pmatrix}
\quad\text{and}\quad
W = \begin{pmatrix} W_1 \\ \vdots \\ W_d \end{pmatrix}
\]
respectively, and the \(n \times d\) diffusion matrix \(\sigma\) by

\[
\sigma = \begin{pmatrix}
\sigma_{11} & \cdots & \sigma_{1d} \\
\vdots & \ddots & \vdots \\
\sigma_{n1} & \cdots & \sigma_{nd}
\end{pmatrix}.
\]

So in matrix notation we have:

\[
dX(t) = \mu(t)\,dt + \sigma\,dW(t).
\]
Let \(f : \mathbb{R}_+ \times \mathbb{R}^n \to \mathbb{R}\) be a \(C^{1,2}\) mapping. Define a new process \(Z\) by

\[
Z(t) = f(t, X(t)).
\]

Then \(Z\) has a stochastic differential given by

\[
df(t, X(t)) = \frac{\partial f}{\partial t}\,dt + \sum_{i=1}^{n}\frac{\partial f}{\partial x_i}\,dX_i + \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\frac{\partial^2 f}{\partial x_i\,\partial x_j}\,dX_i\,dX_j,
\]

with multiplication table

\[
\begin{array}{c|cc}
\cdot & dt & dW_i(t) \\ \hline
dt & 0 & 0 \\
dW_j(t) & 0 & \delta_{ij}\,dt
\end{array}
\qquad\text{with}\quad
\delta_{ij} = \begin{cases} 1 & \text{if } i = j, \\ 0 & \text{if } i \neq j. \end{cases}
\]

It follows that

\[
dX_i\,dX_j = \Bigl(\mu_i\,dt + \sum_{k=1}^{d}\sigma_{ik}\,dW_k\Bigr)\Bigl(\mu_j\,dt + \sum_{l=1}^{d}\sigma_{jl}\,dW_l\Bigr)
= \sum_{k=1}^{d}\sigma_{ik}\sigma_{jk}\,dt = C_{ij}\,dt,
\]

where \(C\) is the \(n \times n\) matrix \(C = \sigma\sigma^{T}\). So, we can write

\[
df(t, X(t)) = \Bigl(\frac{\partial f}{\partial t} + \sum_{i=1}^{n}\mu_i\frac{\partial f}{\partial x_i} + \frac{1}{2}\sum_{i,j=1}^{n}C_{ij}\frac{\partial^2 f}{\partial x_i\,\partial x_j}\Bigr)dt + \sum_{i=1}^{n}\frac{\partial f}{\partial x_i}\sum_{j=1}^{d}\sigma_{ij}\,dW_j. \tag{5.11}
\]
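The multiplication table can be illustrated numerically. For \(f(x_1, x_2) = x_1 x_2\) and two independent Wiener processes, \(dW_1\,dW_2 = 0\), so the formula above gives \(d(W_1 W_2) = W_2\,dW_1 + W_1\,dW_2\) with no correction term. A pathwise sketch (step count and seed are illustrative choices):

```python
import numpy as np

# Pathwise check of the multidimensional Ito formula for f(x1, x2) = x1 * x2
# with X = (W1, W2), W1 and W2 independent Wiener processes.  Here
# dW1 dW2 = 0, so d(W1 W2) = W2 dW1 + W1 dW2 (no correction term).
rng = np.random.default_rng(2)
n, t = 100_000, 1.0
dt = t / n
dW1 = np.sqrt(dt) * rng.standard_normal(n)
dW2 = np.sqrt(dt) * rng.standard_normal(n)
W1 = np.concatenate(([0.0], np.cumsum(dW1)))
W2 = np.concatenate(([0.0], np.cumsum(dW2)))

lhs = W1[-1] * W2[-1]                                   # W1(t) W2(t)
rhs = np.sum(W2[:-1] * dW1) + np.sum(W1[:-1] * dW2)     # int W2 dW1 + int W1 dW2
err = abs(lhs - rhs)                                    # equals |sum dW1 dW2|, small
```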
5.2 Applications of Ito formula
In this part we present various examples of stochastic differential equations and show how their solutions can be found with the aid of the Ito formula.
5.2.1 Examples of Linear SDEs with additive noise
Let us consider the case which models the molecular bombardment of a speck of dust on a water surface, responsible for Brownian motion. The intensity of this bombardment does not depend on the state variables, for instance the position of the speck. Taking \(X_t\) as one of the components of the velocity of the particle, the Langevin equation (see [13]) can be written as

\[
\frac{dX_t}{dt} = -aX_t + b\,\xi_t \tag{5.12}
\]

for the acceleration of the particle, that is, the sum of a retarding frictional force depending on the velocity and the molecular forces represented by a white noise process \(\xi_t\), with intensity \(b\) independent of the velocity of the particle. Here \(a\) and \(b\) are positive constants. The linear equation (5.12) can be written as an Ito SDE

\[
dX_t = -aX_t\,dt + b\,dW_t \tag{5.13}
\]

or in stochastic integral form as

\[
X(t) = X(0) - a\int_0^t X_s\,ds + \int_0^t b\,dW_s, \tag{5.14}
\]

where the second integral is an Ito integral. Such a process is called an Ornstein-Uhlenbeck process. Note that for this kind of SDE it does not matter whether you choose the Ito or the Stratonovich integral, as both lead to the same process.

Equation (5.14) is said to have additive noise because the noise term does not depend on the state variable of the system; it is also a linear equation. It can be shown that equation (5.14) has the explicit solution

\[
X(t) = e^{-at}X_0 + e^{-at}\int_0^t e^{as}\,b\,dW_s
     = e^{-at}X_0 + b\int_0^t e^{-a(t-s)}\,dW_s, \qquad 0 \le t \le T.
\]
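A short simulation illustrates the mean-reverting behaviour implied by this explicit solution: the mean decays to zero and the variance approaches \(b^2/(2a)\). The sketch below uses the exact Gaussian transition of the Ornstein-Uhlenbeck process rather than an Euler discretisation; all parameter values and the seed are illustrative choices.

```python
import numpy as np

# Simulation of the Ornstein-Uhlenbeck process dX = -a X dt + b dW using its
# exact Gaussian transition over each step:
#     X_{t+dt} = e^{-a dt} X_t + N(0, b^2 (1 - e^{-2 a dt}) / (2a)).
# The sample mean decays to 0 and the sample variance approaches b^2/(2a).
rng = np.random.default_rng(3)
a, b, dt, n_steps, n_paths = 2.0, 1.0, 0.01, 1000, 20_000

decay = np.exp(-a * dt)
trans_sd = b * np.sqrt((1.0 - np.exp(-2.0 * a * dt)) / (2.0 * a))

X = np.full(n_paths, 5.0)                 # start all paths far from equilibrium
for _ in range(n_steps):
    X = decay * X + trans_sd * rng.standard_normal(n_paths)

sample_mean = X.mean()                    # near 5 e^{-a t}, tiny for t = 10
sample_var = X.var()                      # near b^2/(2a) = 0.25
```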
But when there are external fluctuations, the intensity of the noise usually depends on the state of the system. For example, the growth coefficient \(\alpha\) in an exponential growth equation \(dx(t) = \alpha x(t)\,dt\) may fluctuate on account of environmental effects, taking the form \(\alpha = a + b\,\xi_t\), where \(a\) and \(b\) are positive constants and \(\xi_t\) is a white noise process. As has been shown before, this leads to the following Ito SDE:

\[
dX_t = aX_t\,dt + bX_t\,dW_t, \tag{5.15}
\]

or in stochastic integral form

\[
X(t) = X(0) + \int_0^t aX_s\,ds + \int_0^t bX_s\,dW_s. \tag{5.16}
\]

The second integral is again an Ito integral, but in contrast with equation (5.13) its integrand involves the unknown solution; equation (5.16) has multiplicative noise. Still, it is a linear equation.
1. (i) Show that the solution of the SDE

\[
dX_t = -X_t\,dt + dW_t
\]

is given by

\[
X_t = e^{-t}X_0 + e^{-t}\int_0^t e^{s}\,dW_s.
\]

(ii) Show that the solution of the SDE

\[
dX_t = (aX_t + b)\,dt + dW_t
\]

is given by

\[
X_t = e^{at}\Bigl(X_0 + \frac{b}{a}\bigl(1 - e^{-at}\bigr) + \int_0^t e^{-as}\,dW_s\Bigr).
\]

(iii) Show that the solution of the SDE

\[
dX_t = \frac{b - X_t}{T - t}\,dt + dW_t
\]

is given by

\[
X_t = X_0\Bigl(1 - \frac{t}{T}\Bigr) + \frac{bt}{T} + (T - t)\int_0^t \frac{1}{T - s}\,dW_s.
\]
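The claimed solution in (iii), a Brownian bridge pulled from \(X_0\) towards \(b\), can be checked in distribution: at time \(t\) it has mean \(X_0(1 - t/T) + bt/T\) and, by the Ito isometry, variance \(t(T - t)/T\). A Monte Carlo sketch evaluated at \(t = T/2\); all parameter values and the seed are illustrative choices:

```python
import numpy as np

# Distributional check of exercise (iii): at time t the solution
#     X_t = X0 (1 - t/T) + b t/T + (T - t) int_0^t (T - s)^{-1} dW_s
# has mean X0 (1 - t/T) + b t/T and variance t (T - t)/T.
rng = np.random.default_rng(4)
X0, b, T = 1.0, 3.0, 1.0
n_steps, n_paths = 1000, 50_000
t_half = T / 2.0
dt = t_half / n_steps

# left-point sum for the stochastic integral int_0^{T/2} (T - s)^{-1} dW_s
stoch_int = np.zeros(n_paths)
for k in range(n_steps):
    s = k * dt
    stoch_int += np.sqrt(dt) * rng.standard_normal(n_paths) / (T - s)

X = X0 * (1.0 - t_half / T) + b * t_half / T + (T - t_half) * stoch_int
mean_X = X.mean()    # near (X0 + b)/2 = 2.0
var_X = X.var()      # near t (T - t)/T = 0.25
```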
2. Solve the Ornstein-Uhlenbeck equation (or Langevin equation)

(a) \(dX_t = \mu X_t\,dt + \sigma\,dW_t, \qquad X_{t_0} = x_0,\)

where \(\mu, \sigma\) are real constants. The solution is called the Ornstein-Uhlenbeck process.

(b) Find \(E[X_t]\) and \(\operatorname{var}[X_t] := E[(X_t - E[X_t])^2]\).
5.2.2 Examples of Linear SDEs with multiplicative noise
Examples
1. By an application of Ito's formula,

\[
X(t) = e^{W(t) - t/2}
\]

is a strong solution of the SDE

\[
dX(t) = X(t)\,dW(t), \qquad X(0) = 1.
\]

2. Consider the SDE

\[
dX(t) = a(t)\,dW(t),
\]

where \(a(t)\) is a deterministic (non-random) \(C^1\) function. The solution must be

\[
X(t) = X(0) + \int_0^t a(s)\,dW(s).
\]

Integrating by parts,

\[
X(t) = X(0) + a(t)W(t) - \int_0^t W(s)\,a'(s)\,ds,
\]

and we see that the solution is a strong solution.

3. The SDE

\[
dX(t) = rX(t)\,dt + \sigma X(t)\,dW(t), \qquad X(0) = 1,
\]

where \(r\) and \(\sigma\) are real constants, has a strong solution given by

\[
X(t) = e^{(r - \sigma^2/2)t + \sigma W(t)}.
\]
4. Consider the general linear SDE (scalar case, i.e. \(n = d = 1\))

\[
dX(t) = (\alpha X(t) + \beta)\,dt + (\gamma X(t) + \delta)\,dW(t).
\]

The solution in the homogeneous case \(\beta = \delta = 0\) is

\[
U(t) = e^{(\alpha - \gamma^2/2)t + \gamma W(t)}.
\]

By an application of Ito's formula we get:

\[
dU^{-1}(t) = U^{-1}(t)\bigl[(-\alpha + \gamma^2)\,dt - \gamma\,dW(t)\bigr].
\]

It follows that

\[
d\bigl(X(t)U^{-1}(t)\bigr) = X(t)\,dU^{-1}(t) + U^{-1}(t)\,dX(t) + dX(t)\,dU^{-1}(t)
= U^{-1}(t)\bigl[(\beta - \gamma\delta)\,dt + \delta\,dW(t)\bigr].
\]

It follows that

\[
X(t) = U(t)\Bigl(x_0 + (\beta - \gamma\delta)\int_0^t U^{-1}(s)\,ds + \delta\int_0^t U^{-1}(s)\,dW(s)\Bigr).
\]
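The variation-of-constants formula above can be checked pathwise against a direct Euler-Maruyama discretisation of the SDE driven by the same Brownian increments. A sketch; the coefficients, step size and seed are illustrative choices:

```python
import numpy as np

# Pathwise check of the variation-of-constants solution of the general linear
# SDE  dX = (alpha X + beta) dt + (gamma X + delta) dW  against a direct
# Euler-Maruyama discretisation driven by the same Brownian increments.
rng = np.random.default_rng(5)
alpha, beta, gamma, delta, x0 = 0.5, 1.0, 0.3, 0.2, 1.0
n, t_end = 100_000, 1.0
dt = t_end / n
dW = np.sqrt(dt) * rng.standard_normal(n)
W = np.concatenate(([0.0], np.cumsum(dW)))
t = np.linspace(0.0, t_end, n + 1)

# fundamental solution U(t) = exp((alpha - gamma^2/2) t + gamma W(t))
U = np.exp((alpha - 0.5 * gamma**2) * t + gamma * W)
Uinv = 1.0 / U
int_ds = np.cumsum(np.concatenate(([0.0], Uinv[:-1] * dt)))  # int_0^t U^{-1} ds
int_dW = np.cumsum(np.concatenate(([0.0], Uinv[:-1] * dW)))  # int_0^t U^{-1} dW
X_exact = U * (x0 + (beta - gamma * delta) * int_ds + delta * int_dW)

# Euler-Maruyama on the same Brownian path
X_em = np.empty(n + 1)
X_em[0] = x0
for k in range(n):
    X_em[k + 1] = (X_em[k] + (alpha * X_em[k] + beta) * dt
                   + (gamma * X_em[k] + delta) * dW[k])

max_err = np.max(np.abs(X_exact - X_em))   # small; both converge pathwise
```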
Example 8 By an application of Ito's formula, it can be shown that

\[
X(t) = e^{W(t) - t/2}
\]

is a strong solution of the SDE

\[
dX(t) = X(t)\,dW(t), \qquad X(0) = 1.
\]

That is, let \(g(t, x) = e^{x - t/2}\), so that \(X(t) = g(t, W(t))\). Then

\[
dg(t, W(t)) = \frac{\partial g}{\partial t}\,dt + \frac{\partial g}{\partial x}\,dW(t) + \frac{1}{2}\frac{\partial^2 g}{\partial x^2}\,(dW(t))^2,
\]
\[
dg(t, W(t)) = -\frac{1}{2}e^{W(t) - t/2}\,dt + e^{W(t) - t/2}\,dW(t) + \frac{1}{2}e^{W(t) - t/2}\,dt,
\]
\[
dX_t = -\frac{1}{2}X_t\,dt + X_t\,dW(t) + \frac{1}{2}X_t\,dt,
\]
\[
dX_t = X_t\,dW(t).
\]
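Since \(dX = X\,dW\) has no \(dt\) term, this solution is a martingale, so \(E[X(t)] = 1\) for every \(t\). A Monte Carlo sketch of this check; the sample size, times and seed are illustrative choices:

```python
import numpy as np

# X(t) = exp(W(t) - t/2) solves dX = X dW with no drift, hence it is a
# martingale and E[X(t)] = 1 for every t.  Monte Carlo check at several times.
rng = np.random.default_rng(6)
n_paths = 400_000

means = {}
for t in (0.5, 1.0, 2.0):
    W_t = np.sqrt(t) * rng.standard_normal(n_paths)
    means[t] = np.exp(W_t - 0.5 * t).mean()   # each close to 1
```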
Example 9 Using the Ito formula we can verify that the SDE (5.15) has the explicit solution

\[
X_t = e^{(a - b^2/2)t + bW_t}.
\]
That is, let \(g(t, x) = e^{(a - b^2/2)t + bx}\), so that \(X_t = g(t, W(t))\). Then

\[
dg(t, W(t)) = \frac{\partial g}{\partial t}\,dt + \frac{\partial g}{\partial x}\,dW(t) + \frac{1}{2}\frac{\partial^2 g}{\partial x^2}\,(dW(t))^2,
\]
\[
dg(t, W(t)) = \Bigl(a - \frac{b^2}{2}\Bigr)e^{(a - b^2/2)t + bW(t)}\,dt + b\,e^{(a - b^2/2)t + bW(t)}\,dW(t) + \frac{b^2}{2}e^{(a - b^2/2)t + bW(t)}\,(dW(t))^2.
\]

Hence

\[
dX_t = \Bigl(a - \frac{b^2}{2}\Bigr)X_t\,dt + b\,X_t\,dW(t) + \frac{b^2}{2}X_t\,dt,
\]
\[
dX(t) = aX(t)\,dt + bX(t)\,dW(t), \qquad X(0) = 1,
\]

while the Stratonovich interpretation would yield the different solution \(\bar X_t = e^{at + bW_t}\).
1. For

\[
dS_t = \mu S_t\,dt + \sigma S_t\,dW(t),
\]

using Ito calculus we can verify that

\[
S_t = S_0\,e^{(\mu - \frac{\sigma^2}{2})t + \sigma W_t}.
\]

2. For

\[
dS_t = \frac{1}{2}\sigma^2 S_t\,dt + \sigma S_t\,dW(t),
\]

using Ito calculus we can verify that

\[
S_t = S_0\,e^{\sigma W_t}.
\]

3. For

\[
dS_t = \sigma S_t\,dW(t),
\]

using Ito calculus we can verify that

\[
S_t = S_0\,e^{\sigma W_t - \frac{1}{2}\sigma^2 t}.
\]
Example 10 Consider the SDE

\[
dX(t) = a(t)\,dW(t),
\]

where \(a(t)\) is a deterministic (non-random) \(C^1\) function. The solution must be

\[
X(t) = X(0) + \int_0^t a(s)\,dW(s).
\]

Integrating by parts,

\[
X(t) = X(0) + a(t)W(t) - \int_0^t W(s)\,a'(s)\,ds,
\]

and we see that the solution is a strong solution.
Exercise 5 Consider the stochastic differential equation:

\[
dX_t = \mu\,dt + \sigma\,dW(t),
\]

where \(\mu\) and \(\sigma\) are constants. This equation is well defined. The exact solution is written as

\[
X_t = X_{t_0} + \mu(t - t_0) + \sigma(W_t - W_{t_0}), \qquad X_{t_0} = 0,\; W_{t_0} = 0.
\]

Such a process is called a generalized Wiener process. Random tracks of \(X_t\) can be generated using the next program, where \(\mu = 0.2\) and \(\sigma = 0.3\).
clear
dt=1; mu=0.2; sigma=0.3; N=100; Rn=randn(N,1); t=zeros(N,1);
W=zeros(N,1);
t(1)=1; W(1)=0;
for i=2:N
    t(i)=i;
    W(i)=W(i-1)+sqrt(dt)*Rn(i);
end
X=mu*t+sigma*W;
plot(t,X,'-');
set(gca,'FontName','Times New Roman','FontSize',16);
x=xlabel('time t');
set(x,'FontName','Times New Roman','FontSize',16);
y=ylabel('X(t)');
set(y,'FontName','Times New Roman','FontSize',16);
title('Simulation of a Generalized Wiener Process')
The result is depicted below.
[Figure: simulated path of X(t) plotted against time t]

Figure 5.2: Simulation of a Generalised Wiener Process
5.2.3 Relation between Ito and Stratonovich SDEs
The difference between the two interpretations lies in the location where the diffusion function, say \(\sigma(t, X_t)\), is evaluated, as we shall soon see in the next part. Fortunately, transformations between the Ito and Stratonovich concepts do exist for models in any dimension (see below and in Jazwinski 1970 [9], for example). The same SDE can lead to different solutions depending on the type of numerical scheme one chooses to use. Therefore it is essential that one defines the SDE in a particular sense and uses the corresponding numerical schemes (more details on the choice of numerical schemes can be found in [13, 11], for example). Thus, if a physical system on one hand is defined by the Ito SDE:
\[
dX_t \overset{\text{Ito}}{=} \mu(t, X_t)\,dt + \sigma(t, X_t)\,dW_t, \tag{5.17}
\]
then the same process can be described also with the Stratonovich equation:

\[
dX_t \overset{\text{Str}}{=} \Bigl(\mu(t, X_t) - \frac{1}{2}\sigma(t, X_t)\frac{\partial \sigma(t, X_t)}{\partial x}\Bigr)dt + \sigma(t, X_t)\,dW_t, \tag{5.18}
\]

that is,

\[
dX_t \overset{\text{Str}}{=} \bar\mu(t, X_t)\,dt + \sigma(t, X_t)\,dW_t,
\]

where

\[
\bar\mu(t, X_t) = \mu(t, X_t) - \frac{1}{2}\sigma(t, X_t)\frac{\partial \sigma(t, X_t)}{\partial x}.
\]
On the other hand, if a physical process is described by the Stratonovich stochastic differential equation:

\[
dX_t \overset{\text{Str}}{=} \mu(t, X_t)\,dt + \sigma(t, X_t)\,dW_t, \tag{5.19}
\]
then the same process can be described also with the Ito equation:

\[
dX_t \overset{\text{Ito}}{=} \Bigl(\mu(t, X_t) + \frac{1}{2}\sigma(t, X_t)\frac{\partial \sigma(t, X_t)}{\partial x}\Bigr)dt + \sigma(t, X_t)\,dW_t, \tag{5.20}
\]

that is,

\[
dX_t \overset{\text{Ito}}{=} \tilde\mu(t, X_t)\,dt + \sigma(t, X_t)\,dW_t,
\]

where

\[
\tilde\mu(t, X_t) = \mu(t, X_t) + \frac{1}{2}\sigma(t, X_t)\frac{\partial \sigma(t, X_t)}{\partial x}.
\]

Note that as long as the diffusion function \(\sigma(t, X_t) = g(t)\) is only time dependent, both interpretations produce the same results. It is essential to note that the Stratonovich formula agrees with the classical differential formula, unlike the Ito formula, where there is an additional term (Kloeden (1999)).
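The drift correction in (5.20) can also be seen numerically: the Euler-Maruyama scheme converges to the Ito solution, while the stochastic Heun (predictor-corrector) scheme converges to the Stratonovich solution. For \(dX = aX\,dt + bX\,dW\) this means sample means near \(e^{at}\) and \(e^{(a + b^2/2)t}\) respectively, consistent with (5.9) and (5.10). A sketch; parameters and the seed are illustrative choices:

```python
import numpy as np

# dX = a X dt + b X dW, X(0) = 1, simulated two ways on the same noise:
#  - Euler-Maruyama converges to the Ito solution,   E[X_t] = e^{a t};
#  - stochastic Heun (predictor-corrector) converges to the Stratonovich
#    solution,                                       E[X_t] = e^{(a + b^2/2) t}.
rng = np.random.default_rng(7)
a, b, t_end, n_steps, n_paths = 0.5, 0.5, 1.0, 1000, 100_000
dt = t_end / n_steps

X_em = np.ones(n_paths)
X_heun = np.ones(n_paths)
for _ in range(n_steps):
    dW = np.sqrt(dt) * rng.standard_normal(n_paths)
    X_em = X_em + a * X_em * dt + b * X_em * dW          # Euler-Maruyama (Ito)
    pred = X_heun + a * X_heun * dt + b * X_heun * dW    # Heun predictor
    X_heun = (X_heun + 0.5 * a * (X_heun + pred) * dt
              + 0.5 * b * (X_heun + pred) * dW)          # Heun corrector (Str.)

mean_ito = X_em.mean()     # near e^{a t}
mean_str = X_heun.mean()   # near e^{(a + b^2/2) t}
```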
Example 11 Let us consider the following geometric Brownian process, which is often applied in finance as a model for stochastic prices, such that the Ito SDE is written as follows:

\[
dX(t) \overset{\text{Ito}}{=} aX(t)\,dt + bX(t)\,dW(t), \qquad X_{t_0} = 1. \tag{5.21}
\]

With the aid of Ito's differential rule and the function \(\phi(x, t) = \ln(x),\ x > 0\), the following general Ito solution can be obtained:

\[
X(t) = e^{(a - \frac{b^2}{2})t + bW(t)}, \qquad X(t_0) = 1,\ W(0) = 0,
\]

where \(a, b\) are positive constants. Meanwhile, for

\[
dX(t) \overset{\text{Str}}{=} aX(t)\,dt + bX(t)\,dW(t), \qquad X_{t_0} = 1, \tag{5.22}
\]

by using equation (5.20) we obtain

\[
dX(t) \overset{\text{Ito}}{=} X(t)\Bigl(a + \frac{b^2}{2}\Bigr)dt + bX(t)\,dW(t), \qquad X_{t_0} = 1. \tag{5.23}
\]

Again, with the aid of Ito's differential rule and the function \(\phi(x, t) = \ln(x)\), the following general Ito solution can be obtained:

\[
X(t) \overset{\text{Ito}}{=} e^{at + bW(t)}, \qquad X(t_0) = 1, \tag{5.24}
\]

while the Stratonovich equation (5.22) has the Stratonovich solution:

\[
X(t) \overset{\text{Str}}{=} e^{at + bW(t)}, \qquad X(t_0) = 1, \tag{5.25}
\]

which is the same solution.
Example 12 Consider the process \(M_t\) satisfying the stochastic differential equation:

\[
dM_t = -aM_t\,dt + a\,dW_t. \tag{5.26}
\]

Note that in this case there is no difference between the Ito and Stratonovich interpretations. It is easy to derive the autocovariance of this process. For example, with \(E[M_0] = 0\) and \(E[M_0^2] = \frac{a}{2}\), we have

\[
E[M_t] = 0, \qquad E[M_t M_s] = \frac{a}{2}\,e^{-a(t-s)}, \qquad t \ge s.
\]
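This stationary autocovariance can be checked by Monte Carlo, drawing \(M_t\) from the stationary law \(N(0, a/2)\) and advancing it by the exact Ornstein-Uhlenbeck transition over a lag \(\Delta\). A sketch; the values of \(a\), \(\Delta\), the sample size and the seed are illustrative choices:

```python
import numpy as np

# Monte Carlo check of the stationary autocovariance of (5.26):
#     E[M_{t+Delta} M_t] = (a/2) e^{-a Delta}.
# M_t is drawn from the stationary law N(0, a/2) and advanced by the exact
# Ornstein-Uhlenbeck transition over the lag Delta.
rng = np.random.default_rng(8)
a, Delta, n_paths = 2.0, 0.5, 500_000

M_t = np.sqrt(a / 2.0) * rng.standard_normal(n_paths)     # M_t ~ N(0, a/2)
rho = np.exp(-a * Delta)                                  # correlation e^{-a Delta}
trans_sd = np.sqrt((a / 2.0) * (1.0 - rho**2))
M_next = rho * M_t + trans_sd * rng.standard_normal(n_paths)

sample_cov = np.mean(M_t * M_next)    # near (a/2) e^{-a Delta}
```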
The solution of equation (5.26) can be shown to be:

\[
M_t = M_0 e^{-at} + a\int_0^t e^{-a(t-r)}\,dW_r. \tag{5.27}
\]

For \(\Delta > 0\) we have \(t + \Delta > t\), and

\[
M_{t+\Delta} = M_0 e^{-a(t+\Delta)} + a\int_0^{t+\Delta} e^{-a(t+\Delta-s)}\,dW_s. \tag{5.28}
\]

The autocovariance can be calculated in many ways; let us use the following approach. Since \(M_0\) is independent of the Wiener increments, the cross terms vanish and

\[
E[M_{t+\Delta}M_t] = E[M_0^2]\,e^{-a(2t+\Delta)} + 0 + a^2\int_0^{t+\Delta}\!\!\int_0^t e^{-a(t+\Delta-s)}\,e^{-a(t-r)}\,E[\xi_s\xi_r]\,ds\,dr.
\]

We have discussed earlier that white noise has the property \(E[\xi_s\xi_r] = \delta(s-r)\), so

\[
E[M_{t+\Delta}M_t] = E[M_0^2]\,e^{-a(2t+\Delta)} + a^2\int_0^{t+\Delta}\!\!\int_0^t e^{-a(t+\Delta-s)}\,e^{-a(t-r)}\,\delta(s-r)\,ds\,dr.
\]

Writing \(F(s) = e^{-a(t+\Delta-s)}\), this becomes

\[
E[M_{t+\Delta}M_t] = E[M_0^2]\,e^{-a(2t+\Delta)} + a^2\int_0^t \Bigl(\int_0^{t+\Delta} F(s)\,\delta(s-r)\,ds\Bigr)e^{-a(t-r)}\,dr. \tag{5.29}
\]

By using the property of the impulse/delta function (see Jazwinski (1970) [9]): if \(F(t)\) is a function of \(t\), then

\[
\int_p^q
\]