Lecture 26. Stochastic differential equations. Existence and uniqueness of solutions

We now turn to the theory of stochastic differential equations, which are the differential equations naturally associated with the theory of stochastic integration.

As usual, we consider a filtered probability space \left( \Omega , (\mathcal{F}_t)_{t \geq 0}, \mathcal{F},\mathbb{P} \right) which satisfies the usual conditions and on which an n-dimensional Brownian motion (B_t)_{t \ge 0} is defined. Let b :\mathbb{R}^n \to \mathbb{R}^n and \sigma: \mathbb{R}^n \to \mathbb{R}^{ n \times n} be functions.

Theorem. Let us assume that there exists C > 0 such that
\| b(x)-b(y) \| + \| \sigma (x) - \sigma (y) \| \le C \| x-y \|, \quad x,y \in \mathbb{R}^n.
Then, for every x_0 \in \mathbb{R}^n, there exists a unique continuous and adapted process (X_t^{x_0})_{t\ge 0} such that for t \ge 0
X_t^{x_0} =x_0 +\int_0^t b(X_s^{x_0}) ds + \int_0^t \sigma (X_s^{x_0}) dB_s.
Moreover, for every T \ge 0,
\mathbb{E} \left( \sup_{0 \le s \le T} \| X_s^{x_0} \|^2 \right) <+\infty .

Proof.
Let us first observe that from our assumptions, there exists K > 0 such that

  • \| b(x)-b(y) \| + \| \sigma (x) - \sigma (y) \| \le K \| x-y \|, x,y \in \mathbb{R}^n;
  • \| b(x) \| + \| \sigma (x)  \| \le K (1 +\| x \|), x \in \mathbb{R}^n.

The idea is to apply a fixed point theorem in a convenient Banach space.
For T > 0, let us consider the space \mathcal{E}_T of continuous and adapted processes (X_t)_{0 \le t \le T} such that
\mathbb{E} \left( \sup_{0 \le s \le T} \| X_s \|^2 \right) < +\infty .
We endow that space with the norm
\| X \|^2 =\mathbb{E} \left( \sup_{0 \le s \le T} \| X_s \|^2 \right).
It is easily seen that (\mathcal{E}_T, \| \cdot \|) is a Banach space.

Step 1: We first prove that if a continuous and adapted process (X^{x_0}_t)_{t \ge 0} is a solution of the equation then, for every T >0, (X^{x_0}_t)_{0 \le t \le T} \in \mathcal{E}_T.

Let us fix T > 0 and consider for n \in \mathbb{N} the stopping times T_n =\inf \{ t \ge 0, \| X^{x_0}_t \| > n \}. For t \le T,
X_{t \wedge T_n}^{x_0} =x_0 +\int_0^{t \wedge T_n} b(X_s^{x_0}) ds + \int_0^{t \wedge T_n} \sigma (X_s^{x_0}) dB_s.
Therefore, by using the inequality \|a+b+c\|^2 \le 3 (\|a\|^2 +\|b\|^2+\|c\|^2), we get
\left\| X_{t \wedge T_n}^{x_0} \right\|^2 \le 3 \left( \|x_0\|^2 +\left\| \int_0^{t \wedge T_n} b(X_s^{x_0}) ds \right\|^2 + \left\| \int_0^{t \wedge T_n} \sigma(X_s^{x_0}) dB_s\right\|^2 \right).
Thus, we have
\mathbb{E} \left( \sup_{0 \le u \le t \wedge T_n} \left\| X_{u}^{x_0} \right\|^2 \right)
\le 3 \left( \|x_0\|^2 + \mathbb{E} \left( \sup_{0 \le u \le t \wedge T_n}  \left\| \int_0^{u \wedge T_n} b(X_s^{x_0}) ds \right\|^2 \right) + \mathbb{E} \left( \sup_{0 \le u \le t \wedge T_n} \left\| \int_0^{u \wedge T_n} \sigma(X_s^{x_0}) dB_s\right\|^2 \right) \right)
By using our assumptions, we first estimate
\mathbb{E} \left( \sup_{0 \le u \le t \wedge T_n}  \left\| \int_0^{u \wedge T_n} b(X_s^{x_0}) ds \right\|^2 \right) \le K^2 \mathbb{E} \left( \left( \int_0^{t \wedge T_n} (1+\| X_s^{x_0} \| ) ds \right)^2 \right).
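Note that, by the Cauchy-Schwarz inequality, the right-hand side can be further bounded as
\mathbb{E} \left( \left( \int_0^{t \wedge T_n} (1+\| X_s^{x_0} \| ) ds \right)^2 \right) \le T \, \mathbb{E} \left( \int_0^{t \wedge T_n} (1+\| X_s^{x_0} \| )^2 ds \right).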
By using our assumptions, Doob’s inequality and Itô’s isometry, we now estimate
\mathbb{E} \left( \sup_{0 \le u \le t \wedge T_n} \left\| \int_0^{u \wedge T_n} \sigma(X_s^{x_0}) dB_s\right\|^2 \right) \le 4 K^2 \mathbb{E} \left( \int_0^{t \wedge T_n} (1 +\| X_s^{x_0} \|)^2 ds \right).
Therefore, from the inequality \|a+b\|^2 \le 2 (\|a\|^2 +\|b\|^2) applied to (1+\| X_s^{x_0} \|)^2, we get
\mathbb{E} \left( \sup_{0 \le u \le t \wedge T_n} \left\| X_{u }^{x_0} \right\|^2 \right)
\le 3 \left( \|x_0\|^2 +2(K^2 T +4K^2) \int_0^t \left(1+\mathbb{E} \left( \sup_{0 \le u \le s \wedge T_n} \left\| X_{u}^{x_0} \right\|^2 \right) \right) ds  \right).
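Recall, for convenience, the classical Gronwall lemma: if \phi : [0,T] \rightarrow [0,+\infty) is a bounded measurable function such that, for some constants A, B \ge 0,
\phi(t) \le A + B \int_0^t \phi(s) ds, \quad 0 \le t \le T,
then \phi(t) \le A e^{Bt} for every 0 \le t \le T.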
We may now apply Gronwall’s lemma to the function t \rightarrow \mathbb{E} \left( \sup_{0 \le u \le t \wedge T_n} \left\| X_{u }^{x_0} \right\|^2 \right) and deduce
\mathbb{E} \left( \sup_{0 \le u \le T \wedge T_n} \left\| X_{u }^{x_0} \right\|^2 \right) \le C
where C is a constant that does not depend on n. Since the paths of X^{x_0} are continuous, we have T_n \rightarrow +\infty almost surely, and Fatou’s lemma then implies, by letting n \rightarrow +\infty, that
\mathbb{E} \left( \sup_{0 \le u \le T} \left\| X_{u }^{x_0} \right\|^2 \right) \le C.
We conclude, as expected, that (X^{x_0}_t)_{0 \le t \le T} \in \mathcal{E}_T.
More generally, by using the same arguments we can observe that if a continuous and adapted process satisfies
X_t =X_0 +\int_0^t b(X_s) ds + \int_0^t \sigma (X_s) dB_s,
with \mathbb{E} (\| X_0 \|^2) < +\infty, then (X_t)_{0 \le t \le T} \in \mathcal{E}_T.

Step 2: We now show existence and uniqueness of solutions for the equation on a time interval [0,T] where T is small enough.

Let us consider the map \Phi that sends a process (X_t)_{0 \le t \le T} \in \mathcal{E}_T to the process \Phi (X)_t =x_0 +\int_0^t b(X_s) ds + \int_0^t \sigma (X_s) dB_s. By using successively the inequality \|a+b\|^2 \le 2(\|a\|^2 +\|b\|^2), the Cauchy-Schwarz inequality and Doob’s inequality, we get
\| \Phi (X) - \Phi (Y) \|^2 \le 2(K^2T^2 +4K^2T) \| X - Y \|^2.
Moreover, arguing in the same way as above, we can prove that
\| \Phi (0) \|^2 \le 3 (\|x_0\|^2+K^2T^2 +4K^2T).
Therefore, if T is small enough, \Phi is a Lipschitz map \mathcal{E}_T \rightarrow \mathcal{E}_T whose Lipschitz constant is strictly less than 1. Consequently, it has a unique fixed point. This fixed point is, of course, the unique solution of the equation on the time interval [0,T]. Here again, we can observe that the same reasoning applies if x_0 is replaced by an \mathcal{F}_0-measurable random variable X_0 that satisfies \mathbb{E}(\|X_0\|^2)<+\infty.
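For the reader’s convenience, here is a sketch of how the contraction estimate is obtained. For 0 \le t \le T,
\| \Phi (X)_t - \Phi (Y)_t \|^2 \le 2 \left( \left\| \int_0^t (b(X_s)-b(Y_s)) ds \right\|^2 + \left\| \int_0^t (\sigma (X_s)-\sigma(Y_s)) dB_s \right\|^2 \right).
Taking the supremum over 0 \le t \le T and then expectations, the Cauchy-Schwarz inequality together with the Lipschitz assumption bounds the first term by K^2 T^2 \| X-Y \|^2, while Doob’s inequality and Itô’s isometry bound the second term by 4K^2 T \| X-Y \|^2.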

Step 3: In order to get a solution of the equation on [0,+\infty), we may apply the previous step to get a solution on successive intervals [t_{k},t_{k+1}], where t_0=0, t_{k+1}-t_k is small enough and t_k \rightarrow +\infty, the initial condition on [t_k,t_{k+1}] being the (square integrable) value X_{t_k} obtained on the previous interval. This provides a solution of the equation on [0,+\infty). This solution is unique, thanks to the uniqueness on each interval [t_{k},t_{k+1}].
\square

Definition: An equation of the type considered in the previous theorem is called a stochastic differential equation.
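Although numerical approximation is not the topic of this lecture, the following minimal Python sketch of the classical Euler-Maruyama scheme illustrates how, under the assumptions of the theorem, one can simulate an approximate solution (here in dimension n = 1; the function name euler_maruyama and its parameters are purely illustrative).

import numpy as np

def euler_maruyama(b, sigma, x0, T, N, rng=None):
    # Approximate dX_t = b(X_t) dt + sigma(X_t) dB_t on [0, T] with N steps of size dt = T / N.
    rng = np.random.default_rng() if rng is None else rng
    dt = T / N
    x = np.empty(N + 1)
    x[0] = x0
    for k in range(N):
        dB = rng.normal(scale=np.sqrt(dt))  # Brownian increment B_{(k+1) dt} - B_{k dt}
        x[k + 1] = x[k] + b(x[k]) * dt + sigma(x[k]) * dB
    return x

# Example: the equation dX_t = theta X_t dt + dB_t with theta = -0.5 (see the first exercise below)
path = euler_maruyama(b=lambda x: -0.5 * x, sigma=lambda x: 1.0, x0=1.0, T=5.0, N=1000)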

Exercise: (Ornstein-Uhlenbeck process) Let \theta \in \mathbb{R}. We consider the following stochastic differential equation,
dX_t=\theta X_t dt +dB_t, \quad X_0=x.

  • Show that it admits a unique solution that is given by
    X_t= e^{\theta t} x+\int_0^t e^{\theta (t-s)} dB_s.
  • Show that (X_t)_{t \ge 0} is a Gaussian process. Compute its mean and covariance function.
  • Show that if \theta <0 then, when t \to +\infty, X_t converges in distribution to a Gaussian distribution.
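Hint for the first point: existence and uniqueness follow from the theorem, since b(x)=\theta x and \sigma(x)=1 are Lipschitz. To compute the solution, the integration by parts formula applied to Y_t=e^{-\theta t} X_t gives
dY_t = -\theta e^{-\theta t} X_t dt + e^{-\theta t} dX_t = e^{-\theta t} dB_t,
so that Y_t = x +\int_0^t e^{-\theta s} dB_s, and the announced formula follows by multiplying by e^{\theta t}.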

Exercise. (Brownian bridge) We consider for 0\le t < 1 the following stochastic differential equation
dX_t =-\frac{X_t}{1-t}dt +dB_t, \quad X_0=0.

  • Show that
    X_t=(1-t)\int_0^t \frac{dB_s}{1-s}
    is the unique solution.
  • Deduce that (X_t)_{0 \le t < 1} is a Gaussian process. Compute its mean and covariance function.
  • Show that X_t \to 0 in L^2 when t \to 1.
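Hint for the first point: if X_t=(1-t)\int_0^t \frac{dB_s}{1-s}, the integration by parts formula gives
dX_t = -\left(\int_0^t \frac{dB_s}{1-s}\right) dt + (1-t) \frac{dB_t}{1-t} = -\frac{X_t}{1-t} dt + dB_t,
so this process is indeed a solution; uniqueness on every interval [0,1-\varepsilon] follows from the same fixed point argument as in the theorem, since the drift is Lipschitz in x, uniformly in t, on [0,1-\varepsilon].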

Exercise. Let \mu \in \mathbb{R} and \sigma > 0. We consider the following stochastic differential equation,
dX_t =\mu X_t dt +\sigma X_t dB_t, \quad X_0=x > 0.
Show that
X_t=xe^{\sigma B_t +(\mu-\frac{\sigma^2}{2})t}
is the unique solution.
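One way to check this: applying Itô’s formula to x e^{\sigma B_t +(\mu-\frac{\sigma^2}{2})t} gives
dX_t = \left(\mu-\frac{\sigma^2}{2}\right) X_t dt +\sigma X_t dB_t +\frac{1}{2}\sigma^2 X_t dt = \mu X_t dt +\sigma X_t dB_t,
with X_0=x, so this process solves the equation; uniqueness is a consequence of the theorem, since b(x)=\mu x and \sigma(x)=\sigma x are Lipschitz.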

The next proposition shows that solutions of stochastic differential equations are intrinsically related to a second order differential operator. This connection will be investigated in more detail later.

Proposition. Let (X_t^{x_0})_{t \ge 0} be a solution of the stochastic differential equation
X_t^{x_0} =x_0 +\int_0^t b(X_s^{x_0}) ds + \int_0^t \sigma(X_s^{x_0}) dB_s,
where b : \mathbb{R}^n \to \mathbb{R}^n and \sigma: \mathbb{R}^n \to \mathbb{R}^{ n \times n} are Borel functions. Now let f : \mathbb{R}^n \to \mathbb{R} be a C^2 function. The process
M^f_t=f(X_t^{x_0})-\int_0^t Lf (X_s^{x_0})ds,
is a local martingale, where L is the second order differential operator
L=\sum_{i=1}^n b_i(x) \frac{\partial}{\partial x_i} +\frac{1}{2}\sum_{i,j=1}^n a_{ij}(x) \frac{\partial^2}{\partial x_i \partial x_j} ,
and a_{ij}(x)=(\sigma(x)\sigma^*(x))_{ij}.
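A sketch of the argument: by Itô’s formula,
f(X_t^{x_0})=f(x_0)+\sum_{i=1}^n \int_0^t \frac{\partial f}{\partial x_i}(X_s^{x_0})\, dX_s^{x_0,i} +\frac{1}{2}\sum_{i,j=1}^n \int_0^t \frac{\partial^2 f}{\partial x_i \partial x_j}(X_s^{x_0})\, d\langle X^{x_0,i},X^{x_0,j}\rangle_s .
Since dX_s^{x_0,i}=b_i(X_s^{x_0})ds+\sum_{k=1}^n \sigma_{ik}(X_s^{x_0})dB^k_s and d\langle X^{x_0,i},X^{x_0,j}\rangle_s =a_{ij}(X_s^{x_0})ds, the bounded variation terms add up to \int_0^t Lf(X_s^{x_0})ds, while the remaining term
\sum_{i=1}^n \sum_{k=1}^n \int_0^t \frac{\partial f}{\partial x_i}(X_s^{x_0})\sigma_{ik}(X_s^{x_0})dB^k_s
is a local martingale.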
