Lecture 14. The Chen expansion formula

The next few lectures will be devoted to the construction of the so-called geometric rough paths. These paths are the lifts of p-rough paths to the free nilpotent Lie group of order p. The construction, which is of an algebraic and geometric nature, will give a clear understanding and description of the space of rough paths. The starting point of the geometric rough path construction is the algebraic study of the signature. We first present the results for continuous paths with bounded variation, because the extension to p-rough paths is then more or less straightforward.

Let us first recall that if x \in C^{1-var}([0,T],\mathbb{R}^d), then the signature of x is defined as the formal series
\mathfrak{S} (x)_{s,t}
=1 + \sum_{k=1}^{+\infty} \sum_{I=(i_1,...,i_k) \in \{1,...,d\}^k} \left( \int_{s \leq t_1 \leq ... \leq t_k \leq t} dx^{i_1}_{t_1}  \cdots  dx^{i_k}_{t_k} \right) X_{i_1} \cdots X_{i_k}
=1+\sum_{k=1}^{+\infty} \int_{\Delta^k[s,t]} dx^{\otimes k}.
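For instance, when d=1, each iterated integral is simply \int_{\Delta^k[s,t]} dx^{\otimes k}=\frac{(x(t)-x(s))^k}{k!} X_1^k, so that
\mathfrak{S} (x)_{s,t}=\sum_{k=0}^{+\infty} \frac{(x(t)-x(s))^k}{k!} X_1^k,
which is nothing but the exponential series in (x(t)-x(s))X_1 and anticipates the commutative computation below.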

If the indeterminates X_1,\cdots,X_d commute (that is, if we work in the commutative algebra of formal series), then the signature of a path admits a very nice representation.

Indeed, let us denote by \mathcal{S}_k the group of permutations of the index set \{1,...,k\}, and if \sigma \in \mathcal{S}_k, let us denote, for a word I=(i_1,...,i_k), by \sigma \cdot I the word (i_{\sigma(1)},...,i_{\sigma(k)}). By letting X_1,\cdots,X_d commute, we get
\mathfrak{S} (x)_{s,t} = 1+ \sum_{k=1}^{+\infty} \sum_{I=(i_1,...,i_k)} X_{i_1} ... X_{i_k} \left( \frac{1}{k!} \sum_{\sigma \in \mathcal{S}_k} \int_{\Delta^k [s,t]}  dx^{\sigma \cdot I} \right).
Since
\sum_{\sigma \in \mathcal{S}_k} \int_{\Delta^k [s,t]}  dx^{\sigma \cdot I} =(x^{i_1}(t)-x^{i_1}(s)) \cdots (x^{i_k}(t)-x^{i_k}(s)),
we deduce
\mathfrak{S} (x)_{s,t}
= 1+ \sum_{k=1}^{+\infty} \frac{1}{k!} \sum_{I=(i_1,...,i_k)} X_{i_1} \cdots X_{i_k}(x^{i_1}(t)-x^{i_1}(s)) \cdots (x^{i_k}(t)-x^{i_k}(s))
=\exp \left( \sum_{i=1}^d (x^i(t)-x^i(s)) X_i \right)
where the exponential of a formal series Y is, of course, defined as
\exp (Y)=\sum_{k=0}^{+\infty} \frac{Y^k}{k!}.
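The symmetrization identity used above simply comes from the decomposition of the cube [s,t]^k into the k! simplices obtained by ordering the coordinates. For instance, for k=2 and I=(i_1,i_2) it reads
\int_{\Delta^2[s,t]} dx^{i_1} dx^{i_2}+\int_{\Delta^2[s,t]} dx^{i_2} dx^{i_1}=(x^{i_1}(t)-x^{i_1}(s))(x^{i_2}(t)-x^{i_2}(s)),
which is just the integration by parts formula for continuous paths with bounded variation.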
As a consequence, the commutative signature of a path is simply the exponential of the increments of the path. Of course, this formula is only true in the commutative case. In the general, non-commutative case, it is remarkable that there still exists a nice formula that expresses the signature as the exponential of a quite explicit series, which turns out to be a Lie series (a notion defined below). We first need to introduce some notation.

We define the Lie bracket between two elements U and V of \mathbb{R} [[ X_1 ,\cdots , X_d ]] by [U,V]=UV-VU. Moreover, if I=(i_1,...,i_k) \in \{ 1,\cdots , d \}^k is a word, we denote by X_I the iterated Lie bracket which is defined by
X_I = [X_{i_1},[X_{i_2},\ldots,[X_{i_{k-1}}, X_{i_{k}}]\ldots]].
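For instance, with this right-nested convention, X_{(i_1,i_2)}=[X_{i_1},X_{i_2}] and X_{(i_1,i_2,i_3)}=[X_{i_1},[X_{i_2},X_{i_3}]].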

Theorem: If x \in C^{1-var}([0,T],\mathbb{R}^d), then
\mathfrak{S} (x)_{s,t} =\exp \left( \sum_{k \geq 1} \sum_{I \in \{1,\cdots ,d\}^k}\Lambda_I (x)_{s,t} X_I \right), \text{ } 0 \le s \le t \le T,
where for k \ge 1, \text{ }I \in \{1,\cdots ,d\}^k :

  • \mathcal{S}_k is the set of permutations of \{1,\cdots ,k\};
  • if \sigma \in \mathcal{S}_k, e(\sigma) is the cardinality of the set \{ j \in \{1,\cdots ,k-1 \} : \sigma (j) > \sigma(j+1) \};
  • \Lambda_I (x)_{s,t}= \sum_{\sigma \in \mathcal{S}_k} \frac{(-1)^{e(\sigma )}}{k^{2}\binom{k-1}{e(\sigma )}} \int_{\Delta^k[s,t]} dx^{\sigma^{-1} \cdot I}.

The first terms in the formula are:
\sum_{I=(i_1)} \Lambda_I (x)_{s,t} X_I=\sum_{i=1}^d (x^i(t)-x^i(s)) X_i
and
\sum_{I=(i_1,i_2)} \Lambda_I (x)_{s,t} X_I=\frac{1}{2} \sum_{1 \leq i<j \leq d}  [X_i , X_j] \int_s^t \left( (x^i(u)-x^i(s))  dx^j(u) -  (x^j(u)-x^j(s)) dx^i(u) \right).
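Let us check this second order term. For k=2, \mathcal{S}_2 only contains the identity, for which e=0, and the transposition, for which e=1, so that for I=(i,j),
\Lambda_{(i,j)} (x)_{s,t}=\frac{1}{4} \left( \int_{\Delta^2[s,t]} dx^{i} dx^{j}-\int_{\Delta^2[s,t]} dx^{j} dx^{i} \right);
summing over all ordered pairs (i,j) and using the antisymmetry of the brackets [X_i,X_j] yields the above expression.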

The proof proceeds in several steps. To simplify the notation a little, we will assume that s=0, t=T and x(0)=0. The idea is to first prove the result when the path x is piecewise linear, that is,
x(t)=x(t_i)+  a_i (t-t_i)
on the interval [t_i,t_{i+1}), where 0=t_0\le t_1 \le \cdots \le  t_N =T, and then to use a limiting argument.

The key point here is the multiplicativity property of the signature that was already pointed out in a previous lecture: For 0\le s \le t \le u \le T,
\mathfrak{S} (x)_{s,u}=\mathfrak{S} (x)_{s,t}\mathfrak{S} (x)_{t,u}.
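At the first two levels, this relation (the so-called Chen relation) reads, for fixed indices i,j,
\int_s^u dx^i=\int_s^t dx^i+\int_t^u dx^i,
\int_{\Delta^2[s,u]} dx^{i} dx^{j}=\int_{\Delta^2[s,t]} dx^{i} dx^{j}+\int_{\Delta^2[t,u]} dx^{i} dx^{j}+(x^i(t)-x^i(s))(x^j(u)-x^j(t)),
which comes from splitting the simplex \Delta^2[s,u] according to the positions of t_1, t_2 with respect to t.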
By using the multiplicativity property inductively, we obtain
\mathfrak{S} (x)_{0,T}=\prod_{n=0}^{N-1} \left( \mathbf{1} + \sum_{k=1}^{+\infty}  \sum_{I=(i_1,...,i_k)} X_{i_1} \cdots X_{i_k} \int_{\Delta^k [t_n,t_{n+1}]} dx^I \right).

Since, on [t_n,t_{n+1}),
dx(t)=a_n dt,
we have
\int_{\Delta^k [t_n,t_{n+1}]}  dx^I =a_n^{i_1} \cdots a_n^{i_k} \int_{\Delta^k [t_n,t_{n+1}]}  dt_{1} \cdots dt_{k} =a_n^{i_1} \cdots a_n^{i_k} \frac{(t_{n+1}-t_n)^k}{k!}.
Therefore
\mathfrak{S} (x)_{0,T}
=\prod_{n=0}^{N-1} \left( \mathbf{1} + \sum_{k=1}^{+\infty} \sum_{I=(i_1,...,i_k)} X_{i_1} \cdots X_{i_k} a_n^{i_1} \cdots a_n^{i_k} \frac{(t_{n+1}-t_n)^k}{k!} \right)
=\prod_{n=0}^{N-1} \exp \left( (t_{n+1}-t_n) \sum_{i=1}^d a_n^i X_i \right).

We now use the Baker-Campbell-Hausdorff-Dynkin formula, which gives a quite explicit expression for a product of exponentials of non-commuting variables:

Proposition: If y_1,\cdots,y_N \in \mathbb{R}^{d}, then
\prod_{n=1}^{N}\exp \left( \sum_{i=1}^d y_n^i X_i \right)  =\exp \left( \sum_{k \geq 1} \sum_{I \in \{1,...,d\}^k}\beta_I (y_1,\cdots,y_N) X_I \right),
where for k \ge 1, \text{ }I \in \{1,...,d\}^k :
\beta_I  (y_1,\cdots,y_N) =\sum_{\sigma \in \mathcal{S}_k} \sum_{1=j_0 \le j_1 \le \cdots \le j_{N-1} \le k} \frac{(-1)^{e(\sigma )}}{j_1!\cdots j_{N-1}! \, k^{2}\binom{k-1}{e(\sigma )}} \prod_{\nu=1}^{N}  y_\nu^{\sigma^{-1}(i_{j_{\nu-1}+1})} \cdots y_\nu^{\sigma^{-1}(i_{j_\nu})}.
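For instance, for N=2, the terms with k\le 2 are the familiar first terms of the classical Baker-Campbell-Hausdorff expansion:
\exp \left( \sum_{i=1}^d y_1^i X_i \right)\exp \left( \sum_{i=1}^d y_2^i X_i \right)=\exp \left( \sum_{i=1}^d (y_1^i+y_2^i) X_i +\frac{1}{2}\sum_{1\le i<j \le d} (y_1^i y_2^j-y_1^j y_2^i)[X_i,X_j]+\cdots \right),
where the dots collect brackets of length at least 3.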

We therefore get:
\mathfrak{S} (x)_{0,T}=\exp \left( \sum_{k \geq 1} \sum_{I \in \{1,...,d\}^k}\beta_I (t_1 a_0,\cdots,(t_N-t_{N-1})a_{N-1}) X_I \right).
It is finally an exercise to check, by using the Chen relations, that:
\beta_I (t_1 a_0,\cdots,(t_N-t_{N-1})a_{N-1})= \sum_{\sigma \in \mathcal{S}_k} \frac{\left(-1\right) ^{e(\sigma )}}{k^{2}\left(  \begin{array}{l}  k-1 \\  e(\sigma )  \end{array}  \right) }  \int_{\Delta^k[0,T]} dx^{\sigma^{-1} \cdot I}.
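As a quick check at the first level: the first order part of the logarithm of a product of exponentials of first order elements is just the sum of these elements, so \beta_{(i)}(y_1,\cdots,y_N)=\sum_{\nu=1}^N y_\nu^i and
\beta_{(i)} (t_1 a_0,\cdots,(t_N-t_{N-1})a_{N-1})=\sum_{n=0}^{N-1}(t_{n+1}-t_n)a_n^i=x^i(T)-x^i(0)=\Lambda_{(i)}(x)_{0,T},
as expected.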

We conclude that if x is piecewise linear then the formula
\mathfrak{S} (x)_{s,t} =\exp \left( \sum_{k \geq 1} \sum_{I \in \{1,\cdots ,d\}^k}\Lambda_I (x)_{s,t} X_I \right), \text{ } 0 \le s \le t \le T

holds. Finally, if x \in C^{1-var}([0,T],\mathbb{R}^d), then we can consider a sequence x_n of piecewise linear interpolations of x along subdivisions of [0,T] whose mesh goes to 0. For this sequence, all the iterated integrals \int_{\Delta^k[0,T]} dx_n^{ I} converge to \int_{\Delta^k[0,T]} dx^{ I} (see, for instance, Proposition 2.7 in the book by Friz-Victoir), and the result follows.
