Lecture 13. Linear differential equations driven by rough paths

In this lecture we define solutions of linear differential equations driven by p-rough paths, p \ge 1, and present Lyons' continuity theorem in this setting. Let x \in \mathbf{\Omega}^p([0,T],\mathbb{R}^d) be a p-rough path with truncated signature \sum_{k=0}^{[p]} \int_{\Delta^k [s,t]}  dx^{\otimes k}, and let x_n \in C^{1-var}([0,T],\mathbb{R}^d) be an approximating sequence such that
\sum_{j=1}^{[p]} \left\|  \int dx^{\otimes j}-   \int  dx_n^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \to 0.

Let us consider matrices M_1,\cdots,M_d \in \mathbb{R}^{N \times N}. We have the following theorem:

Theorem: Let y_n:[0,T] \to \mathbb{R}^N be the solution of the differential equation
y_n(t)=y(0)+\sum_{i=1}^d \int_0^t  M_i y_n(s)d x^i_n(s).
Then, when n \to \infty, y_n converges in the p-variation distance to some y \in  C^{p-var}([0,T],\mathbb{R}^N). The limit y is called the solution of the rough differential equation
y(t)=y(0)+\sum_{i=1}^d \int_0^t  M_i y(s)d x^i(s).

Proof: It is a classical result that the solution of the equation
y_n(t)=y(0)+\sum_{i=1}^d \int_0^t  M_i y_n(s)d x^i_n(s),
can be expanded as a convergent Volterra series:
y_n(t)=y_n(s)+ \sum^{+\infty}_{k=1}\sum_{I=(i_1,\cdots,i_k) \in \{1,\cdots,d\}^k} M_{i_1}\cdots M_{i_k} \left( \int_{\Delta^{k}[s,t]}dx_n^{I} \right) y_n(s).
Therefore, in particular, for n,m \ge 0,
y_n(t)-y_m(t)=\sum^{+\infty}_{k=1}\sum_{I=(i_1,\cdots,i_k)} M_{i_1}\cdots M_{i_k} \left( \int_{\Delta^{k}[0,t]}dx_n^{I}-  \int_{\Delta^{k}[0,t]}dx_m^{I} \right) y(0),
which implies that
\| y_n(t)-y_m(t) \| \le \sum^{+\infty}_{k=1}M^k  \left\|  \int_{\Delta^{k}[0,t]}dx_n^{\otimes k}-  \int_{\Delta^{k}[0,t]}dx_m^{\otimes k} \right\| \| y(0) \|
with M=\max \{ \| M_1 \| , \cdots , \|  M_d \| \}. From the theorems of the previous lectures, there exists a constant C \ge 0 depending only on p and
\sup_n  \sum_{j=1}^{[p]} \left\|  \int dx_n^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]}
such that for k \ge 1 and n,m large enough:
\left\|  \int_{\Delta^k [0,\cdot]}  dx_n^{\otimes k}-  \int_{\Delta^k [0,\cdot]}  dx_m^{\otimes k} \right\|_{p-var, [0,T]}  \le \left( \sum_{j=1}^{[p]} \left\|  \int dx_n^{\otimes j}-   \int  dx_m^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]}  \right) \frac{C^k}{\left( \frac{k}{p}\right)!}.
As a consequence, there exists a constant \tilde{C} such that for n,m large enough:
\| y_n(t)-y_m(t) \| \le \tilde{C}   \sum_{j=1}^{[p]} \left\|  \int dx_n^{\otimes j}-   \int  dx_m^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} .
This already proves that the sequence y_n is uniformly Cauchy, and thus converges in the supremum topology to some y. We now have
(y_n(t)-y_n(s))-(y_m(t)-y_m(s))
=\sum^{+\infty}_{k=1}\sum_{I=(i_1,\cdots,i_k)} M_{i_1}\cdots M_{i_k} \left( \int_{\Delta^{k}[s,t]}dx_n^{I}y_n(s) -\int_{\Delta^{k}[s,t]}dx_m^{I} y_m(s)\right),
and we can bound
\left\|  \int_{\Delta^{k}[s,t]}dx_n^{I}y_n(s) -\int_{\Delta^{k}[s,t]}dx_m^{I} y_m(s) \right\|
\le \left\|  \int_{\Delta^{k}[s,t]}dx_n^{I} \right\| \| y_n(s)-y_m(s) \|+\| y_m(s) \| \left\|  \int_{\Delta^{k}[s,t]}dx_n^{I} -  \int_{\Delta^{k}[s,t]}dx_m^{I}\right\|
\le  \left\|  \int_{\Delta^{k}[s,t]}dx_n^{I} \right\| \| y_n-y_m \|_{\infty, [0,T]} +\| y_m \|_{\infty, [0,T]} \left\|  \int_{\Delta^{k}[s,t]}dx_n^{I} -  \int_{\Delta^{k}[s,t]}dx_m^{I}\right\|.
Again, from the theorems of the previous lectures, there exists a constant C \ge 0, depending only on p and
\sup_n  \sum_{j=1}^{[p]} \left\|  \int dx_n^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]}
such that for k \ge 1 and n,m large enough
\left\|  \int_{\Delta^k [s,t]}  dx_n^{\otimes k} \right\| \le \frac{C^k}{\left( \frac{k}{p}\right)!}  \omega(s,t)^{k/p}, \quad 0 \le s \le t \le T,
and
\left\|  \int_{\Delta^k [s,t]}  dx_n^{\otimes k}-  \int_{\Delta^k [s,t]}  dx_m^{\otimes k} \right\|  \le \left( \sum_{j=1}^{[p]} \left\|  \int dx_n^{\otimes j}-   \int  dx_m^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]}  \right) \frac{C^k}{\left( \frac{k}{p}\right)!} \omega(s,t)^{k/p} ,
where \omega is a control such that \omega(0,T)=1. Consequently, there is a constant \tilde{C}, such that
\| (y_n(t)-y_n(s))-(y_m(t)-y_m(s)) \|
\le   \tilde{C} \left( \| y_n-y_m \|_{\infty, [0,T]} +  \sum_{j=1}^{[p]} \left\|  \int dx_n^{\otimes j}-   \int  dx_m^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right) \omega(s,t)^{1/p}.
This implies the estimate
\| y_n -y_m \|_{p-var,[0,T]} \le   \tilde{C} \left( \| y_n-y_m \|_{\infty, [0,T]} +  \sum_{j=1}^{[p]} \left\|  \int dx_n^{\otimes j}-   \int  dx_m^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right),
so that y_n is also Cauchy in the p-variation distance, which gives the conclusion \square
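
The Volterra expansion at the heart of this proof is easy to experiment with numerically. Below is a minimal Python sketch (assuming numpy and scipy are available; the driver, the matrices and the truncation level K are arbitrary illustrative choices, not part of the lecture). For a piecewise-linear driver the exact solution is a product of matrix exponentials, the truncated signature is assembled segment by segment with Chen's identity, and the truncated Volterra sum then matches the exact solution up to a factorially small tail:

```python
import numpy as np
from itertools import product
from scipy.linalg import expm

rng = np.random.default_rng(0)
d, N_dim, K = 2, 3, 12   # driving channels, state dimension, truncation level
M = [0.2 * rng.standard_normal((N_dim, N_dim)) for _ in range(d)]
y0 = rng.standard_normal(N_dim)

# Piecewise-linear driver x sampled on a fine grid of [0, 1].
steps = 400
t = np.linspace(0.0, 1.0, steps + 1)
x = 0.5 * np.stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)], axis=1)
dx = np.diff(x, axis=0)

# Reference solution: on each linear segment the equation has constant
# coefficients, so the exact step map is a matrix exponential.
y_ref = y0.copy()
for h in dx:
    y_ref = expm(sum(M[i] * h[i] for i in range(d))) @ y_ref

# Truncated signature of x: level k of a linear segment with increment h
# is h^{otimes k}/k!, and segments are concatenated with Chen's identity.
def seg_sig(h):
    levels = [np.ones(())]
    for k in range(1, K + 1):
        levels.append(np.multiply.outer(levels[-1], h) / k)
    return levels

def chen(a, b):
    return [sum(np.multiply.outer(a[j], b[k - j]) for j in range(k + 1))
            for k in range(K + 1)]

sig = seg_sig(dx[0])
for h in dx[1:]:
    sig = chen(sig, seg_sig(h))

# Volterra partial sum; here S_k^{i_1...i_k} carries i_1 as the earliest
# integration variable, so the word I acts on y0 in that order.
y_series = y0.copy()
for k in range(1, K + 1):
    for I in product(range(d), repeat=k):
        v = y0
        for i in I:
            v = M[i] @ v
        y_series = y_series + sig[k][I] * v

print(np.linalg.norm(y_ref - y_series))  # tail decays factorially in K
```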

With just a little more work, it is possible to prove the following stronger result, whose proof is left to the reader.
Theorem: Let y_n:[0,T] \to \mathbb{R}^N be the solution of the differential equation
y_n(t)=y(0)+\sum_{i=1}^d \int_0^t  M_i y_n(s)d x^i_n(s),
and let y be the solution of the rough differential equation
y(t)=y(0)+\sum_{i=1}^d \int_0^t  M_i y(s)d x^i(s).
Then, y \in \mathbf{\Omega}^p([0,T],\mathbb{R}^N) and when n \to \infty,
\sum_{j=1}^{[p]} \left\|  \int dy^{\otimes j}-   \int  dy_n^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \to 0.
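
This stronger statement can also be explored numerically. In the sketch below (assuming numpy and scipy; a Brownian-like piecewise-linear path stands in for a genuine p-rough path, its coarser dyadic interpolations play the role of x_n, and convergence is only monitored at signature level 2 of the solutions), the level-2 iterated integrals of y_n approach those of the solution driven by the finest path:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
d, N_dim = 2, 2
M = [0.3 * rng.standard_normal((N_dim, N_dim)) for _ in range(d)]
y0 = np.array([1.0, 0.0])

# Brownian-like piecewise-linear driver on a fine dyadic grid.
LEVEL = 11
steps = 2 ** LEVEL
incr = rng.standard_normal((steps, d)) / np.sqrt(steps)
x = np.vstack([np.zeros(d), np.cumsum(incr, axis=0)])

def solve(path):
    # exact flow of the linear equation along each linear segment of the driver
    y = [y0]
    for h in np.diff(path, axis=0):
        y.append(expm(sum(M[i] * h[i] for i in range(d))) @ y[-1])
    return np.array(y)

def level2(path):
    # level-2 iterated integral of a piecewise-linear path, via Chen's identity
    dim = path.shape[1]
    S1, S2 = np.zeros(dim), np.zeros((dim, dim))
    for h in np.diff(path, axis=0):
        S2 += np.multiply.outer(S1, h) + np.multiply.outer(h, h) / 2.0
        S1 += h
    return S2

ref = level2(solve(x))
for m in (4, 6, 8, 10):
    y_m = solve(x[:: 2 ** (LEVEL - m)])          # solution driven by x_m
    print(m, np.linalg.norm(level2(y_m) - ref))  # discrepancy shrinks with m
```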

We can also derive useful growth estimates for solutions of rough differential equations. For that, we need the following lemma:

Lemma: For x \ge 0 and p \ge 1,
\sum_{k=0}^{+\infty} \frac{x^k}{\left( \frac{k}{p} \right)!} \le p e^{x^p}.

Proof: For \alpha > 0, we denote
E_\alpha(x)=\sum_{k=0}^{+\infty} \frac{x^k}{\left( k \alpha \right)!},
where \left( k \alpha \right)! = \Gamma(k\alpha+1). This is a special function called the Mittag-Leffler function. From the binomial-type inequality \sum_{j=0}^k \frac{(k\alpha)!}{(j\alpha)! \, ((k-j)\alpha)!} \le \frac{1}{\alpha} 2^{\alpha k}, we obtain
E_\alpha(x)^2
=\sum_{k=0}^{+\infty} \left( \sum_{j=0}^k  \frac{1}{\left( j \alpha \right)!\left( (k-j) \alpha \right)!}\right)x^k
\le \frac{1}{\alpha}\sum_{k=0}^{+\infty} 2^{\alpha k} \frac{x^k}{\left( k \alpha \right)!}=\frac{1}{\alpha}E_\alpha(2^\alpha x).
Thus we proved
E_\alpha(x)\le\frac{1}{\alpha^{1/2}}E_\alpha(2^\alpha x)^{1/2}.
Iterating this inequality k times, we obtain
E_\alpha(x)\le \frac{1}{\alpha^{\sum_{j=1}^k \frac{1}{2^j}}} E_\alpha(2^{\alpha  k}x)^{1/2^k}.
It is known (and not difficult to prove) that
E_\alpha(x) \sim_{x \to \infty} \frac{1}{\alpha} e^{x^{1/\alpha}}.
By letting k \to \infty, and noting that \alpha^{-\sum_{j=1}^k 2^{-j}} \to \frac{1}{\alpha} while the above asymptotics gives E_\alpha(2^{\alpha k}x)^{1/2^k} \to e^{x^{1/\alpha}}, we conclude
E_\alpha(x) \le \frac{1}{\alpha} e^{x^{1/\alpha}}.
The lemma follows by taking \alpha = \frac{1}{p}.
\square
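
The bound is easy to test numerically. The following Python sketch (standard library only; the helper mittag_leffler and its stopping rule are illustrative choices) sums the series with terms computed in log space and compares it with p e^{x^p}:

```python
import math

def mittag_leffler(alpha, x, tol=1e-15):
    # E_alpha(x) = sum_{k >= 0} x^k / Gamma(k*alpha + 1); terms are built in
    # log space so that x**k cannot overflow before Gamma catches up.
    s, k = 0.0, 0
    while True:
        term = math.exp(k * math.log(x) - math.lgamma(k * alpha + 1.0))
        s += term
        if k > 2.0 / alpha and term < tol * s:
            return s
        k += 1

for p in (1.5, 2.0, 3.0):
    for x in (0.5, 1.0, 2.0, 4.0):
        lhs = mittag_leffler(1.0 / p, x)
        rhs = p * math.exp(x ** p)
        print(f"p={p:.1f} x={x:.1f}  E={lhs:.6e}  p*exp(x^p)={rhs:.6e}  ok={lhs <= rhs}")
```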

This estimate provides the following result:

Proposition: Let y be the solution of the rough differential equation:
y(t)=y(0)+\sum_{i=1}^d \int_0^t  M_i y(s)d x^i(s).
Then, there exists a constant C depending only on p such that for 0 \le t \le T,
\| y(t) \| \le p \| y(0)\| e^{ CM^p  \left(  \sum_{j=1}^{[p]} \left\|  \int dx^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,t]} \right)^p},
where M=\max \{ \| M_1 \|, \cdots, \|M_d\| \}.

Proof: We have
y(t)=y(0)+ \sum^{+\infty}_{k=1}\sum_{I=(i_1,\cdots,i_k)} M_{i_1}\cdots M_{i_k} \left( \int_{\Delta^{k}[0,t]}dx^{I} \right) y(0).
Thus we obtain
\| y(t) \| \le \left( 1+ \sum^{+\infty}_{k=1}\sum_{I=(i_1,\cdots,i_k)} M^k \left\| \int_{\Delta^{k}[0,t]}dx^{I} \right\| \right) \| y(0) \|,
and we conclude by combining the estimate on iterated integrals of rough paths from the previous lectures,
\left\|  \int_{\Delta^k [0,t]}  dx^{\otimes k} \right\| \le \frac{C^k}{\left( \frac{k}{p}\right)!} \left(  \sum_{j=1}^{[p]} \left\|  \int dx^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,t]} \right)^k,
with the previous lemma applied at x=CM \sum_{j=1}^{[p]} \left\|  \int dx^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,t]} \square
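
For p=1 this estimate reduces to the classical Gronwall-type bound \| y(t) \| \le \| y(0) \| e^{M \| x \|_{1-var,[0,t]}} (one may take C=1 in that case), which can be checked directly for a bounded-variation driver. A minimal Python sketch, assuming numpy and scipy, with an arbitrary smooth path and the 1-variation measured in the \ell^1 norm on \mathbb{R}^d:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
d, N_dim = 2, 3
M = [0.4 * rng.standard_normal((N_dim, N_dim)) for _ in range(d)]
M_max = max(np.linalg.norm(Mi, 2) for Mi in M)
y0 = rng.standard_normal(N_dim)

# Smooth driver on [0, 2], treated as piecewise linear on a fine grid.
steps = 1000
t = np.linspace(0.0, 2.0, steps + 1)
x = np.stack([np.sin(3.0 * t), t * np.cos(t)], axis=1)
dx = np.diff(x, axis=0)

y, one_var = y0.copy(), 0.0
for h in dx:
    # exact step map, and ||expm(A)|| <= exp(||A||) <= exp(M_max * |h|_1)
    y = expm(sum(M[i] * h[i] for i in range(d))) @ y
    one_var += np.abs(h).sum()   # running 1-variation of x in the l^1 norm
    bound = np.linalg.norm(y0) * np.exp(M_max * one_var)
    assert np.linalg.norm(y) <= bound * (1.0 + 1e-12)

print("||y(t)|| <= ||y(0)|| exp(M ||x||_{1-var,[0,t]}) holds along the path.")
```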
