Lecture 27. Approximation of the Brownian rough path

Our goal in the next two lectures will be to prove that rough differential equations driven by a Brownian motion seen as a p-rough path, 2 < p < 3, are nothing else but stochastic differential equations understood in the Stratonovich sense. The proof of this fact requires an explicit approximation of the Brownian rough path in the rough path topology, which is interesting in itself.

Let (B_t)_{t \ge 0} be a d-dimensional Brownian motion and let us denote by
\mathbf{B}_t=\left( B_t, \frac{1}{2} \left( \int_0^t B^i_sdB^j_s -B^j_sdB^i_s \right)_{1 \le i < j \le d} \right)
its lift in the free Carnot group of step 2 over \mathbb{R}^d.

Let us work on a fixed interval [0,T] and consider a sequence D_n=\{ t_i^n \} of subdivisions of [0,T] such that D_n \subset D_{n+1} and whose mesh goes to 0 when n \to +\infty. An example is given by the sequence of dyadic subdivisions. The family \mathcal{F}_n =\sigma( B_t, t \in D_n ) is then a filtration, that is, an increasing family of \sigma-fields. We denote by B^n the piecewise linear process obtained from B by interpolation along the subdivision D_n, that is, for t_i^n \le t \le t_{i+1}^n,
B^n_t= \frac{t_{i+1}^n -t}{ t_{i+1}^n-t_i^n} B_{t_i^n} + \frac{t-t_i^n}{ t_{i+1}^n-t_i^n} B_{t_{i+1}^n}.
The corresponding lifted process is then
\mathbf{B}^n_t=\left( B^n_t, \frac{1}{2} \left( \int_0^t B^{n,i}_sdB^{n,j}_s -B^{n,j}_sdB^{n,i}_s \right)_{1 \le i < j \le d} \right) .
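This construction is easy to experiment with numerically. The following minimal Python sketch (not part of the lecture, and written for d=2) simulates a Brownian path on a fine dyadic grid, which serves as a stand-in for B, builds the piecewise linear interpolations B^n by keeping only the vertices lying on the n-th dyadic subdivision, and compares the Lévy area of B^n, computed exactly by the shoelace formula since B^n is piecewise linear, with that of the fine-grid proxy; the helper names levy_area and dyadic_interpolation are of course my own.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_fine = 1.0, 14                                    # the level-14 dyadic grid stands in for the true path B
dB = rng.normal(0.0, np.sqrt(T / 2**n_fine), size=(2**n_fine, 2))
B = np.vstack([np.zeros(2), np.cumsum(dB, axis=0)])    # 2-dimensional Brownian path sampled on the fine grid

def levy_area(path):
    """Levy area (1/2) int_0^T X^1 dX^2 - X^2 dX^1 of a piecewise linear path (shoelace formula)."""
    dx = np.diff(path, axis=0)
    return 0.5 * np.sum(path[:-1, 0] * dx[:, 1] - path[:-1, 1] * dx[:, 0])

def dyadic_interpolation(path, n):
    """Vertices of the piecewise linear interpolation B^n along the n-th dyadic subdivision."""
    return path[::(len(path) - 1) // 2**n]

area_B = levy_area(B)                                  # Levy area of the fine-grid proxy for B
for n in range(2, 13):
    print(n, abs(levy_area(dyadic_interpolation(B, n)) - area_B))
```

For a typical sample the printed discrepancies decay as n grows, which illustrates the convergence at fixed time t=T established in the first lemma below.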
The main result of the lecture is the following:

Theorem: Let 2 < p < 3. When n \to +\infty, almost surely, d_{p-var, [0,T]}( \mathbf{B}^n , \mathbf{B}) \to 0.

We split the proof in two lemmas.

Lemma: Let t \in [0,T]. When n \to +\infty, almost surely, d( \mathbf{B}_t^n , \mathbf{B}_t) \to 0.

Proof: We first observe that, due to the Markov property of Brownian motion, we have for t_i^n \le t \le t_{i+1}^n,
\mathbb{E} \left(B_t \mid \mathcal{F}_n\right)=\mathbb{E} \left(B_t \mid B_{t_i^n}, B_{t_{i+1}^n}\right) .
It is then an easy exercise to check that
\mathbb{E} \left(B_t \mid B_{t_i^n}, B_{t_{i+1}^n}\right) = \frac{t_{i+1}^n -t}{ t_{i+1}^n-t_i^n} B_{t_i^n} + \frac{t-t_i^n}{ t_{i+1}^n-t_i^n} B_{t_{i+1}^n}=B^n_t.
As a conclusion, we get
\mathbb{E} \left(B_t \mid \mathcal{F}_n\right)=B_t^n.
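For completeness, here is one way to carry out this exercise (a short sketch using only Gaussian computations; it is not needed for the sequel). Write a=t_i^n, b=t_{i+1}^n and consider, for a fixed component of the Brownian motion,
Z=B_t - \frac{b-t}{b-a} B_a - \frac{t-a}{b-a} B_b .
The vector (Z, B_a, B_b) is Gaussian and centered, and a direct computation of covariances gives
\mathbb{E} \left( Z B_a \right)= a - \frac{b-t}{b-a}\, a - \frac{t-a}{b-a}\, a =0, \quad \mathbb{E} \left( Z B_b \right)= t - \frac{b-t}{b-a}\, a - \frac{t-a}{b-a}\, b =0,
so that Z is independent of (B_a, B_b). Therefore \mathbb{E} \left(B_t \mid B_a, B_b\right)= \frac{b-t}{b-a} B_a + \frac{t-a}{b-a} B_b, which is the claimed formula.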
Since the mesh of D_n goes to 0, the set \bigcup_n D_n is dense in [0,T], so that B_t is measurable with respect to \sigma \left( \bigcup_n \mathcal{F}_n \right); by the martingale convergence theorem, it follows that B_t^n \to B_t almost surely when n \to +\infty. In the same way, we have
\mathbb{E} \left( \int_0^t B^i_sdB^j_s -B^j_sdB^i_s \mid \mathcal{F}_n\right)=\int_0^t B^{n,i}_sdB^{n,j}_s -B^{n,j}_sdB^{n,i}_s.
Indeed, for 0 \le  t < T and \varepsilon small enough, we have by independence of B^i and B^j,
\mathbb{E} \left( B^i_t (B^j_{t+\varepsilon} -B^j_t) \mid \mathcal{F}_n\right)=\mathbb{E} \left( B^i_t\mid \mathcal{F}_n\right) \mathbb{E} \left( B^j_{t+\varepsilon} -B^j_t \mid \mathcal{F}_n\right)=B^{n,i}_t (B^{n,j}_{t+\varepsilon} -B^{n,j}_t) ,
and we conclude using the fact that the Itô integral is a limit in L^2 of its Riemann sums and that conditional expectation is continuous on L^2. By the martingale convergence theorem again, it follows that, almost surely,
\lim_{n \to \infty} \int_0^t B^{n,i}_sdB^{n,j}_s -B^{n,j}_sdB^{n,i}_s=\int_0^t B^i_sdB^j_s -B^j_sdB^i_s,
and we conclude that, almost surely, d( \mathbf{B}_t^n , \mathbf{B}_t) \to 0. \square

The second lemma is a uniform Hölder estimate for \mathbf{B}^n.

Lemma: For every \alpha \in [0,1/2), there exists a finite random variable K that belongs to L^p for every p \ge 1 and such that for every 0 \le s \le t \le T, and every n \ge 1,
d(\mathbf{B}_s^n, \mathbf{B}_t^n) \le K | t-s|^{\alpha}.

Proof: By using the theorem of equivalence of norms, we see that there is a constant C such that
d(\mathbf{B}_s^n, \mathbf{B}_t^n) \le C \left( \| B_t^n- B_s^n\| +\sum_{ i < j} \left| \int_s^t (B^{n,i}_u-B^{n,i}_s)dB^{n,j}_u -(B^{n,j}_u-B^{n,j}_s)dB^{n,i}_u\right|^{1/2}  \right).
From the Garsia-Rodemich-Rumsey inequality, we know that there is a finite random variable K_1 (which belongs to L^p for every p \ge 1) such that for every 0 \le s \le t \le T,
\left| \int_s^t (B^i_u-B^i_s)dB^j_u -(B^j_u-B^j_s)dB^i_u \right| \le K_1 | t-s|^{2\alpha}.
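Let us briefly recall where the exponent 2\alpha comes from (this is only a sketch of the standard scaling argument, included as a reminder). By the scaling property of Brownian motion, for fixed s \le t the random variable
\int_s^t (B^i_u-B^i_s)dB^j_u -(B^j_u-B^j_s)dB^i_u
has the same distribution as (t-s) \left( \int_0^1 B^i_u dB^j_u -B^j_u dB^i_u \right), so that all of its moments are bounded by constants times the corresponding powers of t-s; feeding these moment estimates into the Garsia-Rodemich-Rumsey inequality is what produces the random variable K_1.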
Since
\mathbb{E} \left( \int_s^t (B^i_u-B^i_s)dB^j_u -(B^j_u-B^j_s)dB^i_u \mid \mathcal{F}_n\right)=\int_s^t (B^{n,i}_u-B^{n,i}_s)dB^{n,j}_u -(B^{n,j}_u-B^{n,j}_s)dB^{n,i}_u,
we deduce that
\left|\int_s^t (B^{n,i}_u-B^{n,i}_s)dB^{n,j}_u -(B^{n,j}_u-B^{n,j}_s)dB^{n,i}_u \right| \le K_2  | t-s|^{2\alpha},
where K_2:=\sup_{m \ge 1} \mathbb{E} \left( K_1 \mid \mathcal{F}_m\right) does not depend on n: indeed, taking conditional expectations in the previous bound gives \mathbb{E} \left( K_1 \mid \mathcal{F}_n\right) | t-s|^{2\alpha} as an upper bound, and by Doob's maximal inequality applied to the martingale \left(\mathbb{E} \left( K_1 \mid \mathcal{F}_m\right)\right)_{m \ge 1}, K_2 is a finite random variable that belongs to L^p for every p \ge 1. Similarly, of course, we have
\| B_t^n- B_s^n\| \le K_3 | t-s|^{\alpha},
and this completes the proof. \square

We are now in position to finish the proof that, almost surely, d_{p-var, [0,T]}( \mathbf{B}^n , \mathbf{B}) \to 0 if 2 < p < 3. Indeed, if 0=t_0 < t_1 < \cdots < t_N=T is a subdivision of [0,T], we have for 2 < p' < p,
\sum_{i=0}^{N-1} d\left( ( \mathbf{B}_{t_{i}}^n)^{-1} \mathbf{B}_{t_{i+1}}^n ,  ( \mathbf{B}_{t_{i}})^{-1} \mathbf{B}_{t_{i+1}}\right)^p \le d_{p'-var, [0,T]}( \mathbf{B}^n , \mathbf{B})^{p'} \left(\sup_{0 \le s \le t \le T} d\left( ( \mathbf{B}_{s}^n)^{-1} \mathbf{B}_{t}^n ,  ( \mathbf{B}_{s})^{-1} \mathbf{B}_{t}\right)\right)^{p-p'}.
By using the second lemma (together with the same Garsia-Rodemich-Rumsey estimates applied to \mathbf{B} itself), it is seen that d_{p'-var, [0,T]}( \mathbf{B}^n , \mathbf{B}) is bounded uniformly in n, and by combining the first two lemmas we easily see that \sup_{0 \le s \le t \le T} d\left( ( \mathbf{B}_{s}^n)^{-1} \mathbf{B}_{t}^n ,  ( \mathbf{B}_{s})^{-1} \mathbf{B}_{t}\right) \to 0. Taking the supremum over all subdivisions of [0,T] in the above inequality, we conclude that, almost surely, d_{p-var, [0,T]}( \mathbf{B}^n , \mathbf{B}) \to 0. \square
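To close, here is a second numerical sketch in the same spirit as the one given after the definition of \mathbf{B}^n (again only an illustration, with a fine dyadic grid standing in for B and helper names of my own). Since B^n coincides with B at the points of D_n, at dyadic times s,t the increments of \mathbf{B}^n and \mathbf{B} can only differ through their area components, and the sketch checks that this discrepancy becomes small uniformly over a fixed grid of pairs (s,t) as n grows, which illustrates the uniform closeness of increments used in the last step of the proof.

```python
import numpy as np

rng = np.random.default_rng(1)
T, n_fine = 1.0, 14                                   # the level-14 dyadic grid stands in for the true path B
dB = rng.normal(0.0, np.sqrt(T / 2**n_fine), size=(2**n_fine, 2))
B = np.vstack([np.zeros(2), np.cumsum(dB, axis=0)])

def area_increments(path, I):
    """A_{s,t} = (1/2) int_s^t (X^1_u - X^1_s) dX^2_u - (X^2_u - X^2_s) dX^1_u for a piecewise
    linear path, evaluated at every pair (s, t) of vertex indices taken from I."""
    dx = np.diff(path, axis=0)
    # C_k = int_0^{s_k} X^1 dX^2 - X^2 dX^1, computed exactly segment by segment
    C = np.concatenate([[0.0], np.cumsum(path[:-1, 0] * dx[:, 1] - path[:-1, 1] * dx[:, 0])])
    x1, x2, c = path[I, 0], path[I, 1], C[I]
    return 0.5 * ((c[None, :] - c[:, None])
                  - x1[:, None] * (x2[None, :] - x2[:, None])
                  + x2[:, None] * (x1[None, :] - x1[:, None]))

m = 5                                                 # the pairs (s, t) range over the level-m dyadic grid
A_ref = area_increments(B, np.arange(0, 2**n_fine + 1, 2**(n_fine - m)))
for n in range(m, 13):
    Bn = B[::2**(n_fine - n)]                         # vertices of the piecewise linear interpolation B^n
    A_n = area_increments(Bn, np.arange(0, 2**n + 1, 2**(n - m)))
    print(n, np.max(np.abs(A_n - A_ref)))             # sup over the grid pairs of the area discrepancy
```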


2 Responses to Lecture 27. Approximation of the Brownian rough path

  1. Taras says:

    Is it obvious that $K_2$, which I believe equals $E[K_1|\mathcal F_n]$, does not depend on $n$? Thanks!
