Lecture 8. Rough paths. Fall 2017

In this lecture, it is now time to harvest the fruits of the two previous lectures. This will allow us to finally define the notion of a p-rough path and to construct the signature of such a path.

A first result, which is a consequence of the theorem proved in the previous lecture, is the following continuity of the iterated integrals with respect to a convenient topology. The proof uses arguments very similar to those of the previous two lectures, so we leave it as an exercise for the student.

Theorem: Let p \ge 1, K > 0 and x,y \in C^{1-var}([0,T],\mathbb{R}^d) such that
\sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}- \int dy^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \le 1,
and
\left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right)^p+ \left( \sum_{j=1}^{[p]} \left\| \int dy^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right)^p \le K.
Then there exists a constant C \ge 0 depending only on p and K such that for k \ge 1
\left\| \int_{\Delta^k [0,\cdot]} dx^{\otimes k}- \int_{\Delta^k [0,\cdot]} dy^{\otimes k} \right\|_{p-var, [0,T]} \le \left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}- \int dy^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right) \frac{C^k}{\left( \frac{k}{p}\right)!}.

This continuity result naturally leads to the following definition.

Definition: Let p \ge 1 and x \in C^{p-var}([0,T],\mathbb{R}^d). We say that x is a p-rough path if there exists a sequence x_n \in C^{1-var}([0,T],\mathbb{R}^d) such that x_n\to x in p-variation and such that for every \varepsilon > 0, there exists N \ge 0 such that for m,n \ge N,
\sum_{j=1}^{[p]} \left\| \int dx_n^{\otimes j}- \int dx_m^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \le \varepsilon.
The space of p-rough paths will be denoted \mathbf{\Omega}^p([0,T],\mathbb{R}^d).

From the very definition, \mathbf{\Omega}^p([0,T],\mathbb{R}^d) is the closure of C^{1-var}([0,T],\mathbb{R}^d) inside C^{p-var}([0,T],\mathbb{R}^d) for the distance
d_{\mathbf{\Omega}^p([0,T],\mathbb{R}^d)}(x,y)= \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}- \int dy^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} .

If x \in \mathbf{\Omega}^p([0,T],\mathbb{R}^d) and x_n \in C^{1-var}([0,T],\mathbb{R}^d) is such that x_n\to x in p-variation and such that for every \varepsilon > 0, there exists N \ge 0 such that for m,n \ge N,
\sum_{j=1}^{[p]} \left\| \int dx_n^{\otimes j}- \int dx_m^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \le \varepsilon,
then we define \int_{\Delta^k [s,t]} dx^{\otimes k} for k \le p as the limit of the iterated integrals \int_{\Delta^k [s,t]} dx_n^{\otimes k}. However it is important to observe that \int_{\Delta^k [s,t]} dx^{\otimes k} may then depend on the choice of the approximating sequence x_n. Once the integrals \int_{\Delta^k [s,t]} dx^{\otimes k} are defined for k \le p, we can then use the previous theorem to construct all the iterated integrals \int_{\Delta^k [s,t]} dx^{\otimes k} for k > p. It is then obvious that if x,y \in \mathbf{\Omega}^p([0,T],\mathbb{R}^d), then
1 + \sum_{k=1}^{[p]} \int_{\Delta^k [s,t]} dx^{\otimes k}=1 + \sum_{k=1}^{[p]} \int_{\Delta^k [s,t]} dy^{\otimes k}
implies that
1 + \sum_{k=1}^{+\infty } \int_{\Delta^k [s,t]} dx^{\otimes k}=1 + \sum_{k=1}^{+\infty} \int_{\Delta^k [s,t]} dy^{\otimes k}.
In other words, the signature of a p-rough path is completely determined by its truncated signature at order [p]:
\mathfrak{S}_{[p]} (x)_{s,t} =1 + \sum_{k=1}^{[p]} \int_{\Delta^k [s,t]} dx^{\otimes k}.
For this reason, it is natural to describe a p-rough path by its truncated signature at order [p], in order to stress that the choice of the approximating sequence used to construct the iterated integrals up to order [p] has been made. This will be explained in much more detail when we introduce the notion of geometric rough path over a rough path.
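To make the truncated signature concrete, here is a minimal numerical sketch (an aside, not part of the lecture): for a piecewise-linear path, the signature of each linear segment is the truncated tensor exponential of its increment, and the segment signatures are multiplied together according to Chen’s relations (see the Lemma below). The function name truncated_signature and the sample points are illustrative choices.

import numpy as np

def truncated_signature(points, level=2):
    # Truncated signature (levels 1..level) of the piecewise-linear path
    # through the given points, returned as a list of order-k tensors.
    d = points.shape[1]
    sig = [np.zeros((d,) * (k + 1)) for k in range(level)]
    for a, b in zip(points[:-1], points[1:]):
        v = b - a
        # signature of one linear segment: truncated tensor exponential of v
        seg, t, fact = [], v.copy(), 1.0
        for k in range(level):
            if k > 0:
                t = np.multiply.outer(t, v)
                fact *= k + 1
            seg.append(t / fact)
        # Chen's relations: multiply the running signature by the segment signature
        sig = [sig[k] + seg[k]
               + sum(np.multiply.outer(sig[j], seg[k - 1 - j]) for j in range(k))
               for k in range(level)]
    return sig

# Example: a planar piecewise-linear path.
pts = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
S1, S2 = truncated_signature(pts, level=2)
print(S1)           # level 1: the total increment x(T) - x(0)
print(S2 - S2.T)    # level 2, antisymmetric part: twice the Levy area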

The following results are straightforward to obtain from the previous lectures by a limiting argument.

Lemma: Let x \in \mathbf{\Omega}^p([0,T],\mathbb{R}^d), p \ge 1. For 0 \le s \le t \le u \le T , and n \ge 1,
\int_{\Delta^n [s,u]} dx^{\otimes n}=\sum_{k=0}^{n} \int_{\Delta^k [s,t]} dx^{\otimes k }\int_{\Delta^{n-k} [t,u]} dx^{\otimes (n-k) }.

Theorem: Let p \ge 1. There exists a constant C \ge 0, depending only on p, such that for every x \in\mathbf{\Omega}^p([0,T],\mathbb{R}^d) and k \ge 1,
\left\| \int_{\Delta^k [s,t]} dx^{\otimes k} \right\| \le \frac{C^k}{\left( \frac{k}{p}\right)!} \left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [s,t]} \right)^k, \quad 0 \le s \le t \le T.

If p \ge 2, the space \mathbf{\Omega}^p([0,T],\mathbb{R}^d) is not a priori a Banach space (it is not a linear space) but it is a complete metric space for the distance
d_{\mathbf{\Omega}^p([0,T],\mathbb{R}^d)}(x,y)= \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}- \int dy^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} .
The structure of \mathbf{\Omega}^p([0,T],\mathbb{R}^d) will be better understood in the next lectures, but let us recall that if 1 \le p < 2, then \mathbf{\Omega}^p([0,T],\mathbb{R}^d) is the closure of C^{1-var}([0,T],\mathbb{R}^d) inside C^{p-var}([0,T],\mathbb{R}^d) for the p-variation distance; it is therefore the space we denoted C^{0,p-var}([0,T],\mathbb{R}^d). As a corollary we deduce:

Proposition: Let 1 \le p < 2. Then x \in \mathbf{\Omega}^p([0,T],\mathbb{R}^d) if and only if
\lim_{\delta \to 0} \sup_{ \Pi=\{0=t_0 \le \cdots \le t_n=T\} \in \mathcal{D}[0,T], | \Pi | \le \delta } \sum_{k=0}^{n-1} \| x(t_{k+1}) -x(t_k) \|^p=0,
where \mathcal{D}[0,T] is the set of subdivisions of [0,T]. In particular, for 1 \le q < p < 2,
C^{q-var}([0,T],\mathbb{R}^d) \subset \mathbf{\Omega}^p([0,T],\mathbb{R}^d).
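As an aside (not part of the lecture), the quantity appearing in this proposition can be explored numerically: for the piecewise-linear interpolation of finitely many sample points, the supremum defining the p-variation can be restricted to partitions through the sample points and computed by a simple dynamic program. The helper name p_variation and the random-walk example are illustrative.

import numpy as np

def p_variation(points, p):
    # sup over partitions through the sample points of sum ||x(t_{k+1}) - x(t_k)||^p,
    # computed by dynamic programming in O(n^2).
    n = len(points)
    best = np.zeros(n)
    for j in range(1, n):
        incs = np.linalg.norm(points[j] - points[:j], axis=1) ** p
        best[j] = np.max(best[:j] + incs)
    return best[-1] ** (1.0 / p)

# Example: p-variation of a sampled random-walk path for several values of p.
rng = np.random.default_rng(0)
path = np.cumsum(rng.standard_normal((500, 2)) / 500 ** 0.5, axis=0)
for p in (1.0, 1.5, 2.0, 2.5):
    print(p, p_variation(path, p))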

 

We are now ready to define solutions of linear differential equations driven by p-rough paths, p \ge 1, and to present Lyons’ continuity theorem in this setting. Let x \in \mathbf{\Omega}^p([0,T],\mathbb{R}^d) be a p-rough path with truncated signature \sum_{k=0}^{[p]} \int_{\Delta^k [s,t]} dx^{\otimes k}, and let x_n \in C^{1-var}([0,T],\mathbb{R}^d) be an approximating sequence such that
\sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}- \int dx_n^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \to 0.

Let us consider matrices M_1,\cdots,M_d \in \mathbb{R}^{N \times N}. We have the following theorem:

Theorem: Let y_n:[0,T] \to \mathbb{R}^N be the solution of the differential equation
y_n(t)=y(0)+\sum_{i=1}^d \int_0^t M_i y_n(s)d x^i_n(s).
Then, when n \to \infty, y_n converges in the p-variation distance to some y \in C^{p-var}([0,T],\mathbb{R}^N). The path y is called the solution of the rough differential equation
y(t)=y(0)+\sum_{i=1}^d \int_0^t M_i y(s)d x^i(s).

Proof: It is a classical result that the solution of the equation
y_n(t)=y(0)+\sum_{i=1}^d \int_0^t M_i y_n(s)d x^i_n(s),
can be expanded as the convergent Volterra series:
y_n(t)=y_n(s)+ \sum^{+\infty}_{k=1}\sum_{I=(i_1,\cdots,i_k)} M_{i_1}\cdots M_{i_k} \left( \int_{\Delta^{k}[s,t]}dx_n^{I} \right) y_n(s).
Therefore, in particular, for n,m \ge 0,
y_n(t)-y_m(t)=\sum^{+\infty}_{k=1}\sum_{I=(i_1,\cdots,i_k)} M_{i_1}\cdots M_{i_k} \left( \int_{\Delta^{k}[0,t]}dx_n^{I}- \int_{\Delta^{k}[0,t]}dx_m^{I} \right) y(0),
which implies that
\| y_n(t)-y_m(t) \| \le \sum^{+\infty}_{k=1}M^k \left\| \int_{\Delta^{k}[0,t]}dx_n^{\otimes k}- \int_{\Delta^{k}[0,t]}dx_m^{\otimes k} \right\| \| y(0) \|
with M=\max \{ \| M_1 \| , \cdots , \| M_d \| \}. From the theorems of the previous lectures, there exists a constant C \ge 0 depending only on p and
\sup_n \sum_{j=1}^{[p]} \left\| \int dx_n^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]}
such that for k \ge 1 and n,m big enough:
\left\| \int_{\Delta^k [0,\cdot]} dx_n^{\otimes k}- \int_{\Delta^k [0,\cdot]} dx_m^{\otimes k} \right\|_{p-var, [0,T]} \le \left( \sum_{j=1}^{[p]} \left\| \int dx_n^{\otimes j}- \int dx_m^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right) \frac{C^k}{\left( \frac{k}{p}\right)!}.
As a consequence, there exists a constant \tilde{C} such that for n,m big enough:
\| y_n(t)-y_m(t) \| \le \tilde{C} \sum_{j=1}^{[p]} \left\| \int dx_n^{\otimes j}- \int dx_m^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} .
This already proves that y_n converges in the supremum topology to some y. We now have
(y_n(t)-y_n(s))-(y_m(t)-y_m(s))
=\sum^{+\infty}_{k=1}\sum_{I=(i_1,\cdots,i_k)} M_{i_1}\cdots M_{i_k} \left( \int_{\Delta^{k}[s,t]}dx_n^{I}y_n(s) -\int_{\Delta^{k}[s,t]}dx_m^{I} y_m(s)\right),
and we can bound
\left\| \int_{\Delta^{k}[s,t]}dx_n^{I}y_n(s) -\int_{\Delta^{k}[s,t]}dx_m^{I} y_m(s) \right\|
\le \left\| \int_{\Delta^{k}[s,t]}dx_n^{I} \right\| \| y_n(s)-y_m(s) \|+\| y_m(s) \| \left\| \int_{\Delta^{k}[s,t]}dx_n^{I} - \int_{\Delta^{k}[s,t]}dx_m^{I}\right\|
\le \left\| \int_{\Delta^{k}[s,t]}dx_n^{I} \right\| \| y_n-y_m \|_{\infty, [0,T]} +\| y_m \|_{\infty, [0,T]} \left\| \int_{\Delta^{k}[s,t]}dx_n^{I} - \int_{\Delta^{k}[s,t]}dx_m^{I}\right\|
Again, from the theorems of the previous lectures, there exists a constant C \ge 0, depending only on p and
\sup_n \sum_{j=1}^{[p]} \left\| \int dx_n^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]}
such that for k \ge 1 and n,m big enough
\left\| \int_{\Delta^k [s,t]} dx_n^{\otimes k} \right\| \le \frac{C^k}{\left( \frac{k}{p}\right)!} \omega(s,t)^{k/p}, \quad 0 \le s \le t \le T.
\left\| \int_{\Delta^k [s,t]} dx_n^{\otimes k}- \int_{\Delta^k [s,t]} dx_m^{\otimes k} \right\| \le \left( \sum_{j=1}^{[p]} \left\| \int dx_n^{\otimes j}- \int dx_m^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right) \frac{C^k}{\left( \frac{k}{p}\right)!} \omega(s,t)^{k/p} ,
where \omega is a control such that \omega(0,T)=1. Consequently, there is a constant \tilde{C}, such that
\| (y_n(t)-y_n(s))-(y_m(t)-y_m(s)) \|
\le \tilde{C} \left( \| y_n-y_m \|_{\infty, [0,T]} + \sum_{j=1}^{[p]} \left\| \int dx_n^{\otimes j}- \int dx_m^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right) \omega(s,t)^{1/p}.
This implies the estimate
\| y_n -y_m \|_{p-var,[0,T]} \le \tilde{C} \left( \| y_n-y_m \|_{\infty, [0,T]} + \sum_{j=1}^{[p]} \left\| \int dx_n^{\otimes j}- \int dx_m^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right),
and thus gives the conclusion \square
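As a numerical aside (not from the lecture): when the driver x_n is piecewise linear, the matrix \sum_{i=1}^d M_i \dot{x}^i_n(s) is constant on each segment, so y_n can be computed exactly as a product of matrix exponentials; the sketch below compares this with a crude Euler scheme. The dimensions, matrices and step sizes are illustrative choices.

import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
d, N = 2, 3
M = [0.3 * rng.standard_normal((N, N)) for _ in range(d)]   # M_1, ..., M_d

# a piecewise-linear driving path x_n with values in R^d
pts = np.cumsum(0.2 * rng.standard_normal((21, d)), axis=0)
y0 = np.ones(N)

# exact solution for the piecewise-linear driver:
# on each segment, y <- expm(sum_i M_i dx^i) y
y_exact = y0.copy()
for a, b in zip(pts[:-1], pts[1:]):
    dx = b - a
    y_exact = expm(sum(Mi * dxi for Mi, dxi in zip(M, dx))) @ y_exact

# Euler scheme on a finer grid, for comparison
y_euler = y0.copy()
K = 200                                  # substeps per segment
for a, b in zip(pts[:-1], pts[1:]):
    dx = (b - a) / K
    for _ in range(K):
        y_euler = y_euler + sum((Mi @ y_euler) * dxi for Mi, dxi in zip(M, dx))

print(np.linalg.norm(y_exact - y_euler))  # small discretization error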

With just a little more work, it is possible to prove the following stronger result, whose proof is left to the reader.
Theorem: Let y_n:[0,T] \to \mathbb{R}^N be the solution of the differential equation
y_n(t)=y(0)+\sum_{i=1}^d \int_0^t M_i y_n(s)d x^i_n(s),
and y be the solution of the rough differential equation:
y(t)=y(0)+\sum_{i=1}^d \int_0^t M_i y(s)d x^i(s).
Then, y \in \mathbf{\Omega}^p([0,T],\mathbb{R}^N) and when n \to \infty,
\sum_{j=1}^{[p]} \left\| \int dy^{\otimes j}- \int dy_n^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \to 0.

We can get useful estimates for solutions of rough differential equations. For that, we need the following elementary analysis result:

Proposition: For x \ge 0 and p \ge 1,
\sum_{k=0}^{+\infty} \frac{x^k}{\left( \frac{k}{p} \right)!} \le p e^{x^p}.

Proof: For \alpha \ge 0, we denote
E_\alpha(x)=\sum_{k=0}^{+\infty} \frac{x^k}{\left( k \alpha \right)!}.
This is a special function called the Mittag-Leffler function. From the binomial inequality
E_\alpha(x)^2
=\sum_{k=0}^{+\infty} \left( \sum_{j=0}^k \frac{1}{\left( j \alpha \right)!\left( (k-j) \alpha \right)!}\right)x^k
\le \frac{1}{\alpha}\sum_{k=0}^{+\infty} 2^{\alpha k} \frac{x^k}{\left( k \alpha \right)!}=\frac{1}{\alpha}E_\alpha(2^\alpha x).
Thus we proved
E_\alpha(x)\le\frac{1}{\alpha^{1/2}}E_\alpha(2^\alpha x)^{1/2}.
Iterating this inequality k times, we obtain
E_\alpha(x)\le \frac{1}{\alpha^{\sum_{j=1}^k \frac{1}{2^j}}} E_\alpha(2^{\alpha k}x)^{1/2^k}.
It is known (and not difficult to prove) that
E_\alpha(x) \sim_{x \to \infty} \frac{1}{\alpha} e^{x^{1/\alpha}}.
By letting k \to \infty we conclude
E_\alpha(x) \le \frac{1}{\alpha} e^{x^{1/\alpha}}.
\square
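As a quick numerical sanity check of this bound (an aside, not from the lecture), the series can be summed with the Gamma function; the helper name mittag_leffler_sum and the test values are arbitrary.

import numpy as np
from scipy.special import gamma

def mittag_leffler_sum(x, p, kmax=200):
    # partial sum of sum_k x^k / (k/p)!, with (k/p)! = Gamma(k/p + 1)
    k = np.arange(kmax)
    return np.sum(x ** k / gamma(k / p + 1.0))

for p in (1.5, 2.0, 3.0):
    for x in (0.5, 1.0, 1.5):
        assert mittag_leffler_sum(x, p) <= p * np.exp(x ** p)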

This estimate provides the following result:

Proposition: Let y be the solution of the rough differential equation:
y(t)=y(0)+\sum_{i=1}^d \int_0^t M_i y(s)d x^i(s).
Then, there exists a constant C depending only on p such that for 0 \le t \le T,
\| y(t) \| \le p \| y(0)\| e^{ CM^p \left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,t]} \right)^p},
where M=\max \{ \| M_1 \|, \cdots, \|M_d\| \}.

Proof: We have
y(t)=y(0)+ \sum^{+\infty}_{k=1}\sum_{I=(i_1,\cdots,i_k)} M_{i_1}\cdots M_{i_k} \left( \int_{\Delta^{k}[0,t]}dx^{I} \right) y(0).
Thus we obtain
\| y(t)\| \le \left( 1+ \sum^{+\infty}_{k=1}\sum_{I=(i_1,\cdots,i_k)} M^k \left\| \int_{\Delta^{k}[0,t]}dx^{I} \right\| \right) \| y(0) \|,
and we conclude by using the estimates on iterated integrals of rough paths together with the previous proposition \square


HW6. MA3160 Fall 2017

 

Exercise 1. Patricia receives an average of two texts every 2 hours. If we assume that the number of texts is Poisson distributed, what is the probability that she receives five or more texts in a 9-hour period?

Exercise 2.  A UConn student claims that she can distinguish Dairy Bar ice cream from Friendly’s ice cream. As a test, she is given ten samples of ice cream (each sample is either from the Dairy Bar or Friendly’s) and asked to identify each one. She is right eight times. What is the probability that she would be right exactly eight times if she guessed randomly for each sample?


HW5. MA3160 Fall 2017

Exercise 1. Three balls are randomly chosen with replacement from an urn containing 5 blue, 4 red, and 2 yellow balls. Let X denote the number of red balls chosen.

(a) What are the possible values of X?
(b) What are the probabilities associated with each value?

 

Exercise 2. Suppose X is a random variable such that E[X] = 50 and Var(X) = 12. Calculate the following quantities.

(a) E[X^2]
(b) E[3X + 2]
(c) E[(X+2)^2]
(d) Var[−X]


Annales de la faculté des sciences de Toulouse

Annales de la Faculté des Sciences de Toulouse is a peer-reviewed international journal with a long tradition of excellence (going back to 1887 and Thomas Stieltjes). The journal periodically publishes surveys by the recipients of the Fermat Prize. The Editorial Board encourages high-level submissions.

Submissions in all areas of mathematics are accepted and decisions are usually made within 3 months. The electronic version is free and accessible without subscription.


MA3160. Fall 2017. Midterm 1 sample

Practice midterm 1

 

We will do the correction in class on 09/28.


HW4. MA3160 Fall 2017

Exercise 1. Two dice are rolled. Consider the events A = {sum of two dice equals 3}, B = {sum of two dice equals 7 }, and C = {at least one of the dice shows a 1}.

(a) What is P(A | C)?

(b) What is P(B | C)?

(c) Are A and C independent? What about B and C?

 

Exercise 2. Suppose you roll two standard, fair, 6-sided dice. What is the probability that the sum is at least 9 given that you rolled at least one 6?

Exercise 3.  Color blindness is a sex-linked condition, and 5% of men and 0.25% of women are color blind. The population of the United States is 51% female. What is the probability that a color-blind American is a man?

 


Lecture 7. Rough paths. Fall 2017

In the previous lecture we introduced the signature of a bounded variation path x as the formal series
\mathfrak{S} (x)_{s,t} =1 + \sum_{k=1}^{+\infty} \int_{\Delta^k [s,t]} dx^{\otimes k}.
If now x \in C^{p-var}([0,T],\mathbb{R}^d), p \ge 1, the iterated integrals \int_{\Delta^k [s,t]} dx^{\otimes k} can only be defined as Young integrals when p < 2. In this lecture, we are going to derive some estimates that allow us to define the signature of some (but not all) paths of finite p-variation when p \ge 2. These estimates are due to Terry Lyons in his seminal paper, and this is where rough paths theory really begins.

For P \in \mathbb{R} [[X_1,...,X_d]] that can be written as
P=P_0+\sum_{k = 1}^{+\infty} \sum_{I \in \{1,...,d\}^k}a_{i_1,...,i_k} X_{i_1}...X_{i_k},
we define
\| P \| =|P_0|+\sum_{k = 1}^{+\infty} \sum_{I \in \{1,...,d\}^k}|a_{i_1,...,i_k}| \in [0,\infty].
It is quite easy to check that for P,Q \in \mathbb{R} [[X_1,...,X_d]]
\| PQ \| \le \| P \| \| Q\|.
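As a small illustration (an aside, not in the lecture), a polynomial in the non-commuting indeterminates X_1,...,X_d can be stored as a dictionary from words to coefficients, and the submultiplicativity of the norm checked directly; the helper names norm and mul are ad hoc.

import itertools, random

def norm(P):
    # P maps words (tuples of letters in {1,...,d}) to real coefficients
    return sum(abs(c) for c in P.values())

def mul(P, Q):
    # product in R<X_1,...,X_d>: concatenate words, multiply coefficients
    R = {}
    for w1, c1 in P.items():
        for w2, c2 in Q.items():
            R[w1 + w2] = R.get(w1 + w2, 0.0) + c1 * c2
    return R

random.seed(0)
d = 2
words = [w for k in range(3) for w in itertools.product(range(1, d + 1), repeat=k)]
P = {w: random.uniform(-1, 1) for w in words}
Q = {w: random.uniform(-1, 1) for w in words}
assert norm(mul(P, Q)) <= norm(P) * norm(Q) + 1e-12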
Let x \in C^{1-var}([0,T],\mathbb{R}^d). For p \ge 1, we denote
\left\| \int dx^{\otimes k}\right\|_{p-var, [s,t]}=\left( \sup_{ \Pi=\{s=t_0 \le \cdots \le t_n=t\} \in \mathcal{D}[s,t]} \sum_{i=0}^{n-1} \left\| \int_{\Delta^k [t_i,t_{i+1}]} dx^{\otimes k} \right\|^p \right)^{1/p},
where \mathcal{D}[s,t] is the set of subdivisions of the interval [s,t]. Observe that for k \ge 2, in general
\int_{\Delta^k [s,t]} dx^{\otimes k}+ \int_{\Delta^k [t,u]} dx^{\otimes k} \neq \int_{\Delta^k [s,u]} dx^{\otimes k}.
Actually, from Chen’s relations we have
\int_{\Delta^n [s,u]} dx^{\otimes n}= \int_{\Delta^n [s,t]} dx^{\otimes n}+ \int_{\Delta^n [t,u]} dx^{\otimes n} +\sum_{k=1}^{n-1} \int_{\Delta^k [s,t]} dx^{\otimes k }\int_{\Delta^{n-k} [t,u]} dx^{\otimes (n-k) }.
It follows that \left\| \int dx^{\otimes k}\right\|_{p-var, [s,t]} need not be the p-variation of the path u \to \int_{\Delta^k [s,u]} dx^{\otimes k}.
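As a small numerical illustration (an aside, not in the lecture), here is a check of these two facts at level 2 for a piecewise-linear path; level2 and increment are ad-hoc helper names.

import numpy as np

def increment(points):
    return points[-1] - points[0]

def level2(points):
    # level-2 iterated integral of the piecewise-linear path through the points
    d = points.shape[1]
    s2, x = np.zeros((d, d)), np.zeros(d)
    for a, b in zip(points[:-1], points[1:]):
        v = b - a
        s2 += np.outer(x, v) + 0.5 * np.outer(v, v)
        x += v
    return s2

rng = np.random.default_rng(1)
pts = rng.standard_normal((7, 2))
A, B = pts[:4], pts[3:]                   # the path on [s,t] and on [t,u]
chen = level2(A) + level2(B) + np.outer(increment(A), increment(B))
print(np.allclose(level2(pts), chen))                   # True: Chen's relation
print(np.allclose(level2(pts), level2(A) + level2(B)))  # False: no additivity in general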
The first major result of rough paths theory is the following estimate:

Proposition: Let p \ge 1. There exists a constant C \ge 0, depending only on p, such that for every x \in C^{1-var}([0,T],\mathbb{R}^d) and k \ge 0,
\left\| \int_{\Delta^k [s,t]} dx^{\otimes k} \right\| \le \frac{C^k}{\left( \frac{k}{p}\right)!} \left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [s,t]} \right)^k, \quad 0 \le s \le t \le T.

By \left( \frac{k}{p}\right)!, we of course mean \Gamma \left( \frac{k}{p}+1\right). Some remarks are in order before we prove the result. If p=1, then the estimate becomes
\left\| \int_{\Delta^k [s,t]} dx^{\otimes k} \right\| \le \frac{C^k}{k!} \| x \|_{1-var, [s,t]}^k,
which is immediately checked because
\left\| \int_{\Delta^k [s,t]} dx^{\otimes k} \right\|
\le \sum_{I \in \{1,...,d\}^k} \left\| \int_{\Delta^{k}[s,t]}dx^{I} \right\|
\le \sum_{I \in \{1,...,d\}^k} \int_{s \le t_1 \le t_2 \le \cdots \le t_k \le t} \| dx^{i_1}(t_1) \| \cdots \| dx^{i_k}(t_k)\|
\le \frac{1}{k!} \left( \sum_{j=1}^ d \| x^j \|_{1-var, [s,t]} \right)^k.

We can also observe that for k \le p, the estimate is easy to obtain because
\left\| \int_{\Delta^k [s,t]} dx^{\otimes k} \right\| \le \left\| \int dx^{\otimes k}\right\|_{\frac{p}{k}-var, [s,t]}.
So, all the work is to prove the estimate when k >p. The proof is split into two lemmas. The first one is a binomial inequality which is actually quite difficult to prove:

Lemma: For x,y >0, n \in \mathbb{N}, n \ge 0, and p \ge 1,
\sum_{j=0}^n \frac{x^{j/p}}{\left( \frac{j}{p}\right)!} \frac{y^{(n-j)/p}}{\left( \frac{n-j}{p}\right)!} \le p \frac{(x+y)^{n/p}}{ {\left( \frac{n}{p}\right)!}}.

Proof: See Lemma 2.2.2 in the article by Lyons, or the paper by Hara and Hino for the sharp constant \square
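As a quick numerical sanity check of the lemma (an aside, not in the lecture), both sides can be evaluated with the Gamma function; the helper names and test values are arbitrary.

import numpy as np
from scipy.special import gamma

def lhs(x, y, n, p):
    j = np.arange(n + 1)
    return np.sum(x ** (j / p) / gamma(j / p + 1)
                  * y ** ((n - j) / p) / gamma((n - j) / p + 1))

def rhs(x, y, n, p):
    return p * (x + y) ** (n / p) / gamma(n / p + 1)

for p in (1.5, 2.0, 3.0):
    for n in (1, 4, 10):
        assert lhs(2.0, 3.0, n, p) <= rhs(2.0, 3.0, n, p)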

The second one is a lemma that was essentially proved in the lecture on Young’s integral, but was not explicitly stated there.

Lemma: Let \Gamma: \{ 0 \le s \le t \le T \} \to \mathbb{R}^N. Let us assume that:

  • There exists a control \tilde{\omega} such that
    \lim_{r \to 0} \sup_{(s,t), \tilde{\omega}(s,t) \le r } \frac{\| \Gamma_{s,t} \|}{r}=0;
  • There exists a control \omega and \theta >1, \xi >0 such that for 0 \le s \le t \le u \le T,
    \| \Gamma_{s,u} \| \le \| \Gamma_{s,t} \|+ \| \Gamma_{t,u} \| +\xi \omega(s,u)^\theta.

Then, for all 0 \le s \le t \le T,
\| \Gamma_{s,t} \| \le \frac{\xi}{1-2^{1-\theta}} \omega(s,t)^\theta.

Proof:
See the proof of the Young-Loève estimate or Lemma 6.2 in the book by Friz-Victoir \square

We can now turn to the proof of the main result.

Proof:
Let us denote
\omega(s,t)=\left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [s,t]} \right)^p.
We claim that \omega is a control. Indeed, for 0 \le s \le t \le u \le T, we have, from Minkowski’s inequality and the superadditivity of \left\| \int dx^{\otimes j}\right\|^{p/j}_{\frac{p}{j}-var} over adjacent intervals,
\omega(s,t)+\omega(t,u)
= \left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [s,t]} \right)^p+\left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [t,u]} \right)^p
\le \left( \sum_{j=1}^{[p]}\left( \left\| \int dx^{\otimes j}\right\|^{p/j}_{\frac{p}{j}-var, [s,t]} + \left\| \int dx^{\otimes j}\right\|^{p/j}_{\frac{p}{j}-var, [t,u]}\right)^{1/p} \right)^p
\le \left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [s,u]} \right)^p =\omega(s,u).

It is clear that for some constant \beta > 0 which is small enough, we have for k \le p,
\left\| \int_{\Delta^k [s,t]} dx^{\otimes k} \right\| \le \frac{1}{\beta \left( \frac{k}{p}\right)!} \omega(s,t)^{k/p}.

Let us now consider
\Gamma_{s,t}= \int_{\Delta^{[p]+1} [s,t]} dx^{\otimes ([p]+1)}.
From Chen’s relations, for 0 \le s \le t \le u \le T,
\Gamma_{s,u}= \Gamma_{s,t}+ \Gamma_{t,u}+\sum_{j=1}^{[p]} \int_{\Delta^j [s,t]} dx^{\otimes j }\int_{\Delta^{[p]+1-j} [t,u]} dx^{\otimes ([p]+1-j) }.
Therefore,
\| \Gamma_{s,u}\|
\le \| \Gamma_{s,t} \| + \| \Gamma_{t,u} \| +\sum_{j=1}^{[p]} \left\| \int_{\Delta^j [s,t]} dx^{\otimes j }\right\| \left\| \int_{\Delta^{[p]+1-j} [t,u]} dx^{\otimes ([p]+1-j) }\right\|
\le \| \Gamma_{s,t} \| + \| \Gamma_{t,u} \| +\frac{1}{\beta^2} \sum_{j=1}^{[p]} \frac{1}{ \left( \frac{j}{p}\right)!} \omega(s,t)^{j/p}\frac{1}{ \left( \frac{[p]+1-j}{p}\right)!} \omega(t,u)^{([p]+1-j)/p}
\le \| \Gamma_{s,t} \| + \| \Gamma_{t,u} \| +\frac{1}{\beta^2} \sum_{j=0}^{[p]+1} \frac{1}{ \left( \frac{j}{p}\right)!} \omega(s,t)^{j/p}\frac{1}{ \left( \frac{[p]+1-j}{p}\right)!} \omega(t,u)^{([p]+1-j)/p}
\le \| \Gamma_{s,t} \| + \| \Gamma_{t,u} \| +\frac{1}{\beta^2} p \frac{(\omega(s,t)+\omega(t,u))^{([p]+1)/p}}{ {\left( \frac{[p]+1}{p}\right)!}}
\le \| \Gamma_{s,t} \| + \| \Gamma_{t,u} \| +\frac{1}{\beta^2} p \frac{\omega(s,u)^{([p]+1)/p}}{ {\left( \frac{[p]+1}{p}\right)!}}.
On the other hand, for some constant A we have
\| \Gamma_{s,t} \| \le A \| x \|_{1-var,[s,t]}^{[p]+1},
so the first assumption of the lemma is satisfied with \tilde{\omega}(s,t)=\| x \|_{1-var,[s,t]}.
We deduce from the previous lemma that
\| \Gamma_{s,t} \| \le \frac{1}{\beta^2} \frac{p}{1-2^{1-\theta}} \frac{\omega(s,t)^{([p]+1)/p}}{ {\left( \frac{[p]+1}{p}\right)!}},
with \theta=\frac{[p]+1}{p}. The general case k > p is handled by induction. The details are left to the reader \square

 

Let x \in C^{1-var}([0,T],\mathbb{R}^d). Since
\omega(s,t)=\left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [s,t]} \right)^p
is a control, the estimate
\left\| \int_{\Delta^k [s,t]} dx^{\otimes k} \right\| \le \frac{C^k}{\left( \frac{k}{p}\right)!} \left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [s,t]} \right)^k, \quad 0 \le s \le t \le T.
easily implies that for k > p,
\left\| \int dx^{\otimes k} \right\|_{1-var, [s,t]} \le \frac{C^k}{\left( \frac{k}{p}\right)!} \omega(s,t)^{k/p}.
We stress that it does not imply a bound on the 1-variation of the path t \to \int_{\Delta^k [0,t]} dx^{\otimes k}. What we can get for this path are bounds in p-variation:

Proposition: Let p \ge 1. There exists a constant C \ge 0, depending only on p, such that for every x \in C^{1-var}([0,T],\mathbb{R}^d) and k \ge 0,
\left\| \int_{\Delta^k [0,\cdot]} dx^{\otimes k} \right\|_{p-var, [s,t]} \le \frac{C^k}{\left( \frac{k}{p}\right)!} \omega(s,t)^{1/p} \omega(0,T)^{\frac{k-1}{p}}
where
\omega(s,t)= \left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [s,t]} \right)^p, \quad 0 \le s \le t \le T.

Proof: This is an easy consequence of Chen’s relations. Indeed,

\left\| \int_{\Delta^k [0,t]} dx^{\otimes k} - \int_{\Delta^k [0,s]} dx^{\otimes k} \right\|
=\left\| \sum_{j=1}^k \int_{\Delta^{k-j} [0,s]} dx^{\otimes (k-j)} \int_{\Delta^{j} [s,t]} dx^{\otimes j} \right\|
\le \sum_{j=1}^k \left\| \int_{\Delta^{k-j} [0,s]} dx^{\otimes (k-j)} \right\| \left\| \int_{\Delta^{j} [s,t]} dx^{\otimes j} \right\|
\le C^k \sum_{j=1}^k \frac{1}{\left( \frac{j}{p}\right)!} \omega(s,t)^{j/p} \frac{1}{\left( \frac{k-j}{p}\right)!} \omega(0,T)^{(k-j)/p}
\le C^k \omega(s,t)^{1/p} \sum_{j=1}^k \frac{1}{\left( \frac{j}{p}\right)!} \omega(0,T)^{(j-1)/p} \frac{1}{\left( \frac{k-j}{p}\right)!} \omega(0,T)^{(k-j)/p}
\le C^k \omega(s,t)^{1/p} \omega(0,T)^{(k-1)/p}\sum_{j=1}^k \frac{1}{\left( \frac{j}{p}\right)!} \frac{1}{\left( \frac{k-j}{p}\right)!},
and we conclude with the binomial inequality \square

We are now ready for a second major estimate, which is the key to defining the iterated integrals of a path of finite p-variation when p \ge 2.

Theorem: Let p \ge 1, K > 0 and x,y \in C^{1-var}([0,T],\mathbb{R}^d) such that
\sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}- \int dy^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \le 1,
and
\left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right)^p+ \left( \sum_{j=1}^{[p]} \left\| \int dy^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right)^p \le K.
Then there exists a constant C \ge 0 depending only on p and K such that for 0\le s \le t \le T and k \ge 1
\left\| \int_{\Delta^k [s,t]} dx^{\otimes k}- \int_{\Delta^k [s,t]} dy^{\otimes k} \right\| \le \left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}- \int dy^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right) \frac{C^k}{\left( \frac{k}{p}\right)!} \omega(s,t)^{k/p} ,
\left\| \int_{\Delta^k [s,t]} dx^{\otimes k}\right\| +\left\| \int_{\Delta^k [s,t]} dy^{\otimes k} \right\| \le \frac{C^k}{\left( \frac{k}{p}\right)!} \omega(s,t)^{k/p}
where \omega is the control
\omega(s,t)= \frac{ \left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [s,t]} \right)^p+ \left( \sum_{j=1}^{[p]} \left\| \int dy^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [s,t]} \right)^p } { \left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right)^p+ \left( \sum_{j=1}^{[p]} \left\| \int dy^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right)^p }
+\left( \frac{\sum_{j=1}^{[p]} \left\| \int dx^{\otimes j} - \int dy^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [s,t]} }{\sum_{j=1}^{[p]} \left\| \int dx^{\otimes j} - \int dy^{\otimes j}\right\|^{1/j}_{\frac{p}{j}-var, [0,T]} } \right)^p

Proof: We prove by induction on k that for some constants C,\beta,
\left\| \int_{\Delta^k [s,t]} dx^{\otimes k}- \int_{\Delta^k [s,t]} dy^{\otimes k} \right\| \le \left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}- \int dy^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right) \frac{C^k}{\beta \left( \frac{k}{p}\right)!} \omega(s,t)^{k/p},
\left\| \int_{\Delta^k [s,t]} dx^{\otimes k}\right\| +\left\| \int_{\Delta^k [s,t]} dy^{\otimes k} \right\| \le \frac{C^k}{\beta \left( \frac{k}{p}\right)!} \omega(s,t)^{k/p}

For k \le p, we trivially have
\left\| \int_{\Delta^k [s,t]} dx^{\otimes k}- \int_{\Delta^k [s,t]} dy^{\otimes k} \right\| \le \left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}- \int dy^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right)^k \omega(s,t)^{k/p}
\le \left( \sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}- \int dy^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} \right) \omega(s,t)^{k/p}.
and
\left\| \int_{\Delta^k [s,t]} dx^{\otimes k}\right\| +\left\| \int_{\Delta^k [s,t]} dy^{\otimes k} \right\| \le K^{k/p} \omega(s,t)^{k/p}.
Now let us assume that the result holds up to order k, with k \ge [p]. Let
\Gamma_{s,t}=\int_{\Delta^{k+1} [s,t]} dx^{\otimes (k+1)}- \int_{\Delta^{k+1} [s,t]} dy^{\otimes (k+1)}.
From Chen’s relations, for 0 \le s \le t \le u \le T,
\Gamma_{s,u}= \Gamma_{s,t}+ \Gamma_{t,u}
+\sum_{j=1}^{k} \int_{\Delta^j [s,t]} dx^{\otimes j }\int_{\Delta^{k+1-j} [t,u]} dx^{\otimes (k+1-j) }-\sum_{j=1}^{k} \int_{\Delta^j [s,t]} dy^{\otimes j }\int_{\Delta^{k+1-j} [t,u]} dy^{\otimes (k+1-j) }.
Therefore, from the binomial inequality
\| \Gamma_{s,u}\|
\le \| \Gamma_{s,t} \| + \| \Gamma_{t,u} \| +\sum_{j=1}^{k} \left\| \int_{\Delta^j [s,t]} dx^{\otimes j }- \int_{\Delta^j [s,t]} dy^{\otimes j } \right\| \left\| \int_{\Delta^{k+1-j} [t,u]} dx^{\otimes (k+1-j) }\right\|
+\sum_{j=1}^{k} \left\| \int_{\Delta^{j} [s,t]} dy^{\otimes j }\right\| \left\| \int_{\Delta^{k+1-j} [t,u]} dx^{\otimes (k+1-j) }- \int_{\Delta^{k+1-j} [t,u]} dy^{\otimes (k+1-j) } \right\|
\le \| \Gamma_{s,t} \| + \| \Gamma_{t,u} \| +\frac{1}{\beta^2}\tilde{\omega}(0,T) \sum_{j=1}^{k} \frac{C^j}{\left( \frac{j}{p}\right)!} \omega(s,t)^{j/p} \frac{C^{k+1-j}}{\left( \frac{k+1-j}{p}\right)!} \omega(t,u)^{(k+1-j)/p}
+\frac{1}{\beta^2}\tilde{\omega}(0,T) \sum_{j=1}^{k} \frac{C^j}{\left( \frac{j}{p}\right)!} \omega(s,t)^{j/p} \frac{C^{k+1-j}}{\left( \frac{k+1-j}{p}\right)!} \omega(t,u)^{(k+1-j)/p}
\le \| \Gamma_{s,t} \| + \| \Gamma_{t,u} \| +\frac{2p}{\beta^2} \tilde{\omega}(0,T) C^{k+1} \frac{ \omega(s,u)^{(k+1)/p}}{\left( \frac{k+1}{p}\right)! }
where
\tilde{\omega}(0,T)=\sum_{j=1}^{[p]} \left\| \int dx^{\otimes j}- \int dy^{\otimes j} \right\|^{1/j}_{\frac{p}{j}-var, [0,T]} .
We deduce
\| \Gamma_{s,t} \| \le \frac{2p}{\beta^2(1-2^{1-\theta})} \tilde{\omega}(0,T) C^{k+1} \frac{ \omega(s,t)^{(k+1)/p}}{\left( \frac{k+1}{p}\right)! }
with \theta= \frac{k+1}{p}. A correct choice of \beta finishes the induction argument \square
