## Lecture 2. Continuous paths with bounded variation

The first few lectures are essentially reminders of undergraduate real analysis. We will cover some aspects of the theory of differential equations driven by continuous paths with bounded variation. The point is to fix the notation that will be used throughout the course and to stress the importance of the topology of convergence in 1-variation when we are interested in stability results for solutions with respect to the driving signal.

If $s \le t$, we will denote by $\Delta [s,t]$ the set of subdivisions of the interval $[s,t]$, that is, $\Pi \in \Delta [s,t]$ can be written
$\Pi=\left\{ s= t_0 < t_1 < \cdots < t_n =t \right\}.$

Definition: A continuous path $x : [s,t] \to \mathbb{R}^d$ is said to have bounded variation on $[s,t]$ if the 1-variation of $x$ on $[s,t]$, which is defined as
$\| x \|_{1-var; [s,t]} :=\sup_{ \Pi \in \Delta[s,t]} \sum_{k=0}^{n-1} \| x(t_{k+1}) -x(t_k) \|,$
is finite. The space of continuous bounded variation paths $x : [s,t] \to \mathbb{R}^d$ will be denoted by $C^{1-var} ([s,t], \mathbb{R}^d)$.

$\| \cdot \|_{1-var; [s,t]}$ is not a norm, because constant functions have zero 1-variation, but it is obviously a semi-norm. If $x$ is continuously differentiable on $[s,t]$, it is easily seen that
$\| x \|_{1-var, [s,t]}=\int_s^t \| x'(u) \| du.$
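To make the definition concrete, here is a short numerical illustration (a Python/NumPy sketch; the helper `one_variation` and the path `circle` are ad hoc names, not part of the course): for the smooth path $x(t)=(\cos t, \sin t)$ on $[0,2\pi]$ we have $\| x'(u) \|=1$, so the 1-variation is the arc length $2\pi$, and the subdivision sums increase towards it under refinement.

```python
import numpy as np

def one_variation(path, times):
    """Sum of ||x(t_{k+1}) - x(t_k)|| over the subdivision `times`.

    Each such sum is a lower bound for ||x||_{1-var}; the supremum over
    all subdivisions is the 1-variation itself."""
    pts = np.array([path(t) for t in times])
    return np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))

# x(t) = (cos t, sin t) on [0, 2*pi] has ||x'(u)|| = 1, so its
# 1-variation is the arc length 2*pi; refining the subdivision
# makes the sums increase towards 2*pi.
circle = lambda t: np.array([np.cos(t), np.sin(t)])
for n in (10, 100, 1000):
    print(n, one_variation(circle, np.linspace(0.0, 2 * np.pi, n + 1)))
```

Note that for a uniform subdivision each chord is shorter than the arc it subtends, so every printed sum stays below $2\pi$.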

Proposition: Let $x \in C^{1-var} ([0,T], \mathbb{R}^d)$. The function $(s,t)\to \| x \|_{1-var, [s,t]}$ is additive, i.e., for $0 \le s \le t \le u \le T$,
$\| x \|_{1-var, [s,t]}+ \| x \|_{1-var, [t,u]}= \| x \|_{1-var, [s,u]},$
and controls $x$ in the sense that for $0 \le s \le t \le T$,
$\| x(s)-x(t) \| \le \| x \|_{1-var, [s,t]}.$
The function $s \to \| x \|_{1-var, [0,s]}$ is moreover continuous and non-decreasing.

Proof: If $\Pi_1 \in \Delta [s,t]$ and $\Pi_2 \in \Delta [t,u]$, then $\Pi_1 \cup \Pi_2 \in \Delta [s,u]$, and the sum associated with $\Pi_1 \cup \Pi_2$ splits as the sum associated with $\Pi_1$ plus the sum associated with $\Pi_2$. Taking suprema over $\Pi_1$ and $\Pi_2$, we obtain
$\sup_{ \Pi_1 \in \Delta[s,t]} \sum_{k=0}^{n-1} \| x(t_{k+1}) -x(t_k) \| +\sup_{ \Pi_2 \in \Delta[t,u]} \sum_{k=0}^{n-1} \| x(t_{k+1}) -x(t_k) \| \le \sup_{ \Pi \in \Delta[s,u]} \sum_{k=0}^{n-1} \| x(t_{k+1}) -x(t_k) \|,$
thus
$\| x \|_{1-var, [s,t]}+ \| x \|_{1-var, [t,u]} \le \| x \|_{1-var, [s,u]}.$
Let now $\Pi \in \Delta[s,u]$:
$\Pi=\left\{ s= t_0 < t_1 < \cdots < t_n =u \right\}.$
Let $k=\max \{ j, t_j \le t\}$. By the triangle inequality, $\| x(t_{k+1}) -x(t_k) \| \le \| x(t) -x(t_k) \| + \| x(t_{k+1}) -x(t) \|$, so that
$\sum_{j=0}^{n-1} \| x(t_{j+1}) -x(t_j) \|$
$\le \left( \sum_{j=0}^{k-1} \| x(t_{j+1}) -x(t_j) \| + \| x(t) -x(t_k) \| \right) + \left( \| x(t_{k+1}) -x(t) \| + \sum_{j=k+1}^{n-1} \| x(t_{j+1}) -x(t_j) \| \right)$
$\le \| x \|_{1-var, [s,t]}+ \| x \|_{1-var, [t,u]},$
since the two parenthesized sums correspond to subdivisions of $[s,t]$ and $[t,u]$ respectively.
Taking the supremum over $\Pi \in \Delta[s,u]$ yields
$\| x \|_{1-var, [s,t]}+ \| x \|_{1-var, [t,u]} \ge \| x \|_{1-var, [s,u]},$
which completes the proof of the additivity. The bound $\| x(s)-x(t) \| \le \| x \|_{1-var, [s,t]}$ is immediate, because $\{ s < t \}$ is itself a subdivision of $[s,t]$. The proof of the continuity and monotonicity of $s \to \| x \|_{1-var, [0,s]}$ is left to the reader $\square$
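The additivity can be checked numerically on a smooth example (a sketch; the helper `variation_sum` and the test curve are illustrative names, and the sums over fine uniform subdivisions are exact only in the limit of refinement):

```python
import numpy as np

def variation_sum(path, s, t, n=2000):
    """Approximate ||x||_{1-var,[s,t]} by the sum over a fine
    uniform subdivision of [s, t]."""
    times = np.linspace(s, t, n + 1)
    pts = np.array([path(u) for u in times])
    return np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))

# A smooth plane curve; additivity holds for any continuous BV path.
x = lambda t: np.array([np.cos(t), np.sin(2 * t)])

left = variation_sum(x, 0.0, 1.0)    # ||x||_{1-var,[0,1]}
right = variation_sum(x, 1.0, 2.0)   # ||x||_{1-var,[1,2]}
whole = variation_sum(x, 0.0, 2.0)   # ||x||_{1-var,[0,2]}
print(left + right, whole)  # equal up to discretization error
```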

This control of the path by its 1-variation is an illustration of the notion of control, which is very useful in rough paths theory.

Definition: A map $\omega: \{ 0 \le s \le t \le T \} \to [0,\infty)$ is called superadditive if for all $s \le t \le u$,
$\omega(s,t)+\omega(t,u) \le \omega (s,u).$
If, in addition, $\omega$ is continuous and $\omega(t,t)=0$, we call $\omega$ a control. We say that a path $x:[0,T] \to \mathbb{R}^d$ is controlled by a control $\omega$, if there exists a constant $C \ge 0$, such that for every $0 \le s \le t \le T$,
$\| x(t) -x(s) \| \le C \omega(s,t).$
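In code (a sketch; `variation_sum` and the test path are illustrative), the proposition above says precisely that $\omega(s,t) = \| x \|_{1-var, [s,t]}$ is a control, and that it controls $x$ with constant $C=1$; this can be tested on random triples $s \le t \le u$:

```python
import numpy as np

def variation_sum(path, s, t, n=2000):
    """Approximate omega(s,t) := ||x||_{1-var,[s,t]} on a fine
    uniform subdivision of [s, t]."""
    times = np.linspace(s, t, n + 1)
    pts = np.array([path(u) for u in times])
    return np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))

x = lambda t: np.array([np.cos(t), np.sin(t)])  # a smooth BV path on [0, 3]

rng = np.random.default_rng(0)
for _ in range(5):
    s, t, u = np.sort(rng.uniform(0.0, 3.0, size=3))
    # superadditivity (for the 1-variation it is in fact additivity),
    # up to discretization error
    assert variation_sum(x, s, t) + variation_sum(x, t, u) <= variation_sum(x, s, u) + 1e-5
    # omega controls x with constant C = 1
    assert np.linalg.norm(x(t) - x(s)) <= variation_sum(x, s, t) + 1e-5
print("omega(s,t) = ||x||_{1-var,[s,t]} behaves like a control")
```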

Obviously, Lipschitz functions have bounded variation. The converse is of course not true: $t\to \sqrt{t}$ has bounded variation on $[0,1]$ but is not Lipschitz. However, any continuous path with bounded variation is the reparametrization of a Lipschitz path in the following sense.
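The example $t \to \sqrt{t}$ can be seen numerically (a short Python sketch): since the path is monotone, its subdivision sums telescope, so the 1-variation on $[0,1]$ is exactly $\sqrt{1}-\sqrt{0}=1$, while the difference quotients $\sqrt{h}/h = 1/\sqrt{h}$ blow up as $h \to 0$.

```python
import numpy as np

# Monotone path: the subdivision sums telescope, so the 1-variation of
# sqrt on [0, 1] is exactly sqrt(1) - sqrt(0) = 1 ...
times = np.linspace(0.0, 1.0, 10001)
sqrt_variation = np.sum(np.abs(np.diff(np.sqrt(times))))
print(sqrt_variation)  # 1.0 up to rounding

# ... yet the Lipschitz quotients sqrt(h)/h = 1/sqrt(h) are unbounded.
for h in (1e-2, 1e-4, 1e-6):
    print(h, np.sqrt(h) / h)
```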

Proposition: Let $x \in C^{1-var} ([0,T], \mathbb{R}^d)$. There exist a Lipschitz function $y:[0,1] \to \mathbb{R}^d$, and a continuous and non-decreasing function $\phi:[0,T]\to [0,1]$ such that $x=y\circ \phi$.

Proof: We may assume $\| x \|_{1-var, [0,T]} \neq 0$, otherwise $x$ is constant and the result is trivial. Consider
$\phi(t)=\frac{ \| x \|_{1-var, [0,t]} }{ \| x \|_{1-var, [0,T]} }.$
It is continuous and non-decreasing. There exists a function $y$ such that $x=y\circ \phi$, because $\phi(t_1)=\phi(t_2)$ implies $x(t_1)=x(t_2)$. Using the additivity of the 1-variation, we then have, for $s \le t$,
$\| y( \phi(t)) -y ( \phi(s)) \|=\| x(t) -x (s) \| \le \| x \|_{1-var, [s,t]} =\| x \|_{1-var, [0,T]} (\phi(t)-\phi(s) )$ $\square$
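A discretized version of this construction (a sketch; the grid size and the path `x` are arbitrary choices): $\phi(t)$ is the normalized running 1-variation, and the Lipschitz bound of the proof, $\| x(t)-x(s) \| \le \| x \|_{1-var,[0,T]} (\phi(t)-\phi(s))$, holds exactly at the grid level by the triangle inequality.

```python
import numpy as np

T = 2.0
x = lambda t: np.array([t ** 2, np.sin(3 * t)])  # an illustrative BV path

times = np.linspace(0.0, T, 4001)
pts = np.array([x(t) for t in times])
increments = np.linalg.norm(np.diff(pts, axis=0), axis=1)
cum = np.concatenate([[0.0], np.cumsum(increments)])  # running 1-variation t -> ||x||_{1-var,[0,t]}
total = cum[-1]                                       # ||x||_{1-var,[0,T]}
phi = cum / total                                     # non-decreasing, phi(0) = 0, phi(T) = 1

# Check the Lipschitz bound at two arbitrary grid indices i < j:
i, j = 500, 3000
print(np.linalg.norm(pts[j] - pts[i]), total * (phi[j] - phi[i]))
```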

The next result shows that the set of continuous paths with bounded variation is a Banach space.

Theorem: The space $C^{1-var} ([0,T], \mathbb{R}^d)$ endowed with the norm $\| x(0) \|+ \| x \|_{1-var, [0,T]}$ is a Banach space.

Proof: Let $x^n \in C^{1-var} ([0,T], \mathbb{R}^d)$ be a Cauchy sequence. It is clear that
$\| x^n -x^m \|_\infty \le \| x^n(0)-x^m(0) \|+ \| x^n-x^m \|_{1-var, [0,T]}.$
Thus, $x^n$ converges uniformly to a continuous path $x :[0,T] \to \mathbb{R}^d$. We need to prove that $x$ has bounded variation. Let
$\Pi=\left\{ 0=t_0 < t_1 < \cdots < t_n =T \right\}$
be a subdivision of $[0,T]$. Since the sequence is Cauchy, $\sup_{n} \| x^n \|_{1-var,[0,T]} < \infty$, and there is $m \ge 0$ such that $\| x - x^m \|_\infty \le \frac{1}{2n}$, thus
$\sum_{k=0}^{n-1} \|x(t_{k+1})-x(t_k) \|$
$\le \sum_{k=0}^{n-1} \|x(t_{k+1})-x^m(t_{k+1}) \| +\sum_{k=0}^{n-1} \|x^m(t_{k})-x(t_k) \| +\| x^m \|_{1-var,[0,T]}$
$\le 1+\sup_{n} \| x^n \|_{1-var,[0,T]}.$
Thus, we have
$\| x \|_{1-var,[0,T]} \le 1+\sup_{n} \| x^n \|_{1-var,[0,T]} < \infty.$
A similar argument shows that $\| x -x^n \|_{1-var,[0,T]} \to 0$, so that $x^n$ converges to $x$ in $C^{1-var} ([0,T], \mathbb{R}^d)$ $\square$

For approximation purposes, it is important to observe that the set of smooth paths is not dense in $C^{1-var} ([0,T], \mathbb{R}^d)$ for the 1-variation convergence topology. The closure of the set of smooth paths in the 1-variation norm, which shall be denoted by $C^{0,1-var} ([0,T], \mathbb{R}^d)$, is the set of absolutely continuous paths.

Proposition: Let $x \in C^{1-var} ([0,T], \mathbb{R}^d)$. Then, $x \in C^{0,1-var} ([0,T], \mathbb{R}^d)$ if and only if there exists $y \in L^1([0,T])$ such that,
$x(t)=x(0)+\int_0^t y(s) ds.$

Proof: First, let us assume that
$x(t)=x(0)+\int_0^t y(s) ds,$
for some $y \in L^1([0,T])$. Since smooth paths are dense in $L^1([0,T])$, we can find a sequence of smooth paths $y^n$ such that $\| y-y^n \|_1 \to 0$. Define then,
$x^n(t)=x(0)+\int_0^t y^n(s) ds.$
We have
$\| x-x^n \|_{1-var,[0,T]}=\| y-y^n \|_1.$
This implies that $x \in C^{0,1-var} ([0,T], \mathbb{R}^d)$. Conversely, if $x \in C^{0,1-var} ([0,T], \mathbb{R}^d)$, there exists a sequence of smooth paths $x^n$ that converges in the 1-variation topology to $x$. Each $x^n$ can be written as,
$x^n(t)=x^n(0)+\int_0^t y^n(s) ds.$
We still have
$\| x^m-x^n \|_{1-var,[0,T]}=\| y^m-y^n \|_1,$
so that $y^n$ converges to some $y$ in $L^1$. It is then clear that
$x(t)=x(0)+\int_0^t y(s) ds$
$\square$

Exercise: Let $x \in C^{1-var} ([0,T], \mathbb{R}^d)$. Show that $x$ is the limit in 1-variation of piecewise linear interpolations if and only if $x \in C^{0,1-var} ([0,T], \mathbb{R}^d)$.
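One direction of the exercise can be watched numerically (a sketch; the fine grid, the number of nodes, and the helper `piecewise_linear` are arbitrary choices): for a smooth path, which is absolutely continuous and hence lies in $C^{0,1-var}$, the 1-variation distance to its piecewise linear interpolation $x^n$, estimated on a fine grid, goes to $0$ as $n$ grows.

```python
import numpy as np

x = lambda t: np.array([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])

fine = np.linspace(0.0, 1.0, 20001)
vals = np.array([x(t) for t in fine])

def piecewise_linear(nodes):
    """Values on the fine grid of the piecewise linear interpolation of x at `nodes`."""
    node_vals = np.array([x(t) for t in nodes])
    return np.column_stack([np.interp(fine, nodes, node_vals[:, i]) for i in range(2)])

errs = []
for n in (4, 16, 64):
    diff = vals - piecewise_linear(np.linspace(0.0, 1.0, n + 1))
    errs.append(np.sum(np.linalg.norm(np.diff(diff, axis=0), axis=1)))
    print(n, errs[-1])  # decreasing estimates of ||x - x^n||_{1-var,[0,1]}
```

For a path of bounded variation that is not absolutely continuous, these estimates would not tend to $0$, in accordance with the exercise.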

This entry was posted in Rough paths theory.