In the study of a stochastic process it is often useful to consider some properties of the process that hold up to a random time. A natural question is, for instance: how long does the process stay below a given constant?

**Definition.** *Let $(\mathfrak{F}_t)_{t \geq 0}$ be a filtration on a probability space $(\Omega, \mathfrak{F}, \mathbb{P})$. Let $T$ be a random variable, measurable with respect to $\mathfrak{F}$ and valued in $[0, +\infty]$. We say that $T$ is a stopping time of the filtration $(\mathfrak{F}_t)_{t \geq 0}$ if for every $t \geq 0$, $\{T \leq t\} \in \mathfrak{F}_t$.*

Often, a stopping time will be the first time at which a stochastic process adapted to the filtration satisfies a given property. The above definition means that for any $t \geq 0$, at time $t$, one is able to decide whether this property has been satisfied or not.

Among the most important examples of stopping times are the (first) hitting times of a closed set by a continuous stochastic process.

**Exercise.** *(First hitting time of a closed set by a continuous stochastic process)
Let $(X_t)_{t \geq 0}$ be a continuous process adapted to a filtration $(\mathfrak{F}_t)_{t \geq 0}$. Let
\[ T = \inf \{ t \geq 0 : X_t \in F \}, \]
where $F$ is a closed subset of $\mathbb{R}$. Show that $T$ is a stopping time of the filtration $(\mathfrak{F}_t)_{t \geq 0}$.*
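As a numerical illustration (a simulation, not part of the proof), one can compute such a hitting time for a discrete-time path. The sketch below is in Python; the function name and the choice of closed set $F = [a, +\infty)$ are purely illustrative, and a symmetric random walk stands in for a continuous process.

```python
import random

def first_hitting_time(path, a):
    """First index t with path[t] >= a, i.e. the first entry of the path
    into the closed set F = [a, +infinity); None if the path never enters F."""
    for t, x in enumerate(path):
        if x >= a:
            return t
    return None

# A symmetric random walk as a discrete stand-in for a continuous process.
random.seed(0)
path = [0]
for _ in range(1000):
    path.append(path[-1] + random.choice([-1, 1]))

T = first_hitting_time(path, a=5)
# The stopping-time property: whether T <= t is decided by path[0..t] alone.
assert all(
    (first_hitting_time(path[: t + 1], a=5) is not None) == (T is not None and T <= t)
    for t in range(100)
)
```

The final assertion is exactly the content of the definition above: the event $\{T \leq t\}$ is determined by the path up to time $t$.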

Given a stopping time $T$, we may define the $\sigma$-algebra of events that occur before the time $T$:

**Proposition.** *Let $T$ be a stopping time of the filtration $(\mathfrak{F}_t)_{t \geq 0}$. Let
\[ \mathfrak{F}_T = \big\{ A \in \mathfrak{F} : \text{for every } t \geq 0, \; A \cap \{T \leq t\} \in \mathfrak{F}_t \big\}. \]
Then $\mathfrak{F}_T$ is a $\sigma$-algebra.*

**Proof:**

Since for every $t \geq 0$, $\{T \leq t\} \in \mathfrak{F}_t$, we have that $\Omega \in \mathfrak{F}_T$. Let us now consider $A \in \mathfrak{F}_T$. For every $t \geq 0$ we have
\[ A^c \cap \{T \leq t\} = \{T \leq t\} \setminus \big( A \cap \{T \leq t\} \big) \in \mathfrak{F}_t, \]
and thus $A^c \in \mathfrak{F}_T$. Finally, if $(A_n)_{n \in \mathbb{N}}$ is a sequence of elements of $\mathfrak{F}_T$,
\[ \left( \bigcup_{n \in \mathbb{N}} A_n \right) \cap \{T \leq t\} = \bigcup_{n \in \mathbb{N}} \big( A_n \cap \{T \leq t\} \big) \in \mathfrak{F}_t, \]
so that $\bigcup_{n \in \mathbb{N}} A_n \in \mathfrak{F}_T$. $\square$

If $T$ is a stopping time of a filtration with respect to which a given process is adapted, then it is possible to stop this process in a natural way at the time $T$. We leave the proof of the corresponding proposition as an exercise to the reader.

**Proposition.** *Let $(\mathfrak{F}_t)_{t \geq 0}$ be a filtration on a probability space $(\Omega, \mathfrak{F}, \mathbb{P})$ and let $T$ be an almost surely finite stopping time of the filtration $(\mathfrak{F}_t)_{t \geq 0}$. Let $(X_t)_{t \geq 0}$ be a stochastic process that is adapted and progressively measurable with respect to the filtration $(\mathfrak{F}_t)_{t \geq 0}$. The stopped stochastic process $(X_{t \wedge T})_{t \geq 0}$ is progressively measurable with respect to the filtration $(\mathfrak{F}_t)_{t \geq 0}$.*
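In discrete time the stopped process is easy to visualize: the path is simply frozen at the index $T$. A minimal Python sketch (the helper name is ours, not standard):

```python
def stopped_path(path, T):
    """Path of the stopped process t -> X_{min(t, T)}: beyond index T the path is frozen.
    T = None encodes a stopping time that is never reached, so nothing is frozen."""
    if T is None:
        return list(path)
    return [path[min(t, T)] for t in range(len(path))]

# Example: stopping the path (0, 1, 2, 3) at T = 1 freezes it at the value 1.
print(stopped_path([0, 1, 2, 3], T=1))  # [0, 1, 1, 1]
```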

We are now ready to introduce martingales in continuous time. Such processes were first extensively studied by Joseph Doob. Together with the Markov processes, which we will study later, they are among the most important classes of stochastic processes and lie at the heart of the theory of stochastic integration.

**Definition.** *Let $(\mathfrak{F}_t)_{t \geq 0}$ be a filtration defined on a probability space $(\Omega, \mathfrak{F}, \mathbb{P})$. A process $(X_t)_{t \geq 0}$ that is adapted to $(\mathfrak{F}_t)_{t \geq 0}$ is called a submartingale with respect to this filtration if:*

- *For every $t \geq 0$, $\mathbb{E}(|X_t|) < +\infty$;*
- *For every $0 \leq s \leq t$, $\mathbb{E}(X_t \mid \mathfrak{F}_s) \geq X_s$.*

*A stochastic process $(X_t)_{t \geq 0}$ that is adapted to $(\mathfrak{F}_t)_{t \geq 0}$ and such that $(-X_t)_{t \geq 0}$ is a submartingale is called a supermartingale. Finally, a stochastic process that is adapted to $(\mathfrak{F}_t)_{t \geq 0}$ and that is at the same time a submartingale and a supermartingale is called a martingale.*
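As a sanity check (a Monte Carlo experiment, not a proof), one can verify numerically that for the symmetric random walk, which is a martingale, the mean $\mathbb{E}(X_t)$ stays approximately constant, while the mean of the submartingale $(X_t^2)_{t \geq 0}$ increases. The Python sketch below is only illustrative; sample size and horizon are arbitrary.

```python
import random

random.seed(1)
N, steps = 20000, 30
sum_x = [0.0] * (steps + 1)    # running sums of X_t over the N sample paths
sum_x2 = [0.0] * (steps + 1)   # running sums of X_t^2
for _ in range(N):
    x = 0
    for t in range(1, steps + 1):
        x += random.choice([-1, 1])
        sum_x[t] += x
        sum_x2[t] += x * x

mean_x = [s / N for s in sum_x]     # martingale: stays close to 0 for every t
mean_x2 = [s / N for s in sum_x2]   # submartingale: grows (here like t)
```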

The following exercises provide some first properties of these processes.

**Exercise. (Closed martingale)**

*Let $(\mathfrak{F}_t)_{t \geq 0}$ be a filtration defined on a probability space $(\Omega, \mathfrak{F}, \mathbb{P})$ and let $X$ be an integrable and $\mathfrak{F}$-measurable random variable. Show that the process $\big( \mathbb{E}(X \mid \mathfrak{F}_t) \big)_{t \geq 0}$ is a martingale with respect to the filtration $(\mathfrak{F}_t)_{t \geq 0}$.*
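A hint for the key step: for $0 \leq s \leq t$, the tower property of conditional expectation gives

\[ \mathbb{E}\big( \mathbb{E}(X \mid \mathfrak{F}_t) \,\big|\, \mathfrak{F}_s \big) = \mathbb{E}(X \mid \mathfrak{F}_s), \]

since $\mathfrak{F}_s \subset \mathfrak{F}_t$; integrability follows from $\mathbb{E}\big( |\mathbb{E}(X \mid \mathfrak{F}_t)| \big) \leq \mathbb{E}(|X|) < +\infty$.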

**Exercise.** *Let $(\mathfrak{F}_t)_{t \geq 0}$ be a filtration defined on a probability space $(\Omega, \mathfrak{F}, \mathbb{P})$ and let $(X_t)_{t \geq 0}$ be a submartingale with respect to the filtration $(\mathfrak{F}_t)_{t \geq 0}$. Show that the function $t \mapsto \mathbb{E}(X_t)$ is non-decreasing.*
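A hint: take expectations on both sides of the submartingale inequality. Since $\mathbb{E}\big( \mathbb{E}(X_t \mid \mathfrak{F}_s) \big) = \mathbb{E}(X_t)$, one obtains, for $0 \leq s \leq t$,

\[ \mathbb{E}(X_t) = \mathbb{E}\big( \mathbb{E}(X_t \mid \mathfrak{F}_s) \big) \geq \mathbb{E}(X_s). \]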

**Exercise.** *Let $(\mathfrak{F}_t)_{t \geq 0}$ be a filtration defined on a probability space $(\Omega, \mathfrak{F}, \mathbb{P})$ and let $(X_t)_{t \geq 0}$ be a martingale with respect to the filtration $(\mathfrak{F}_t)_{t \geq 0}$. Let now $\psi : \mathbb{R} \to \mathbb{R}$ be a convex function such that for every $t \geq 0$, $\mathbb{E}(|\psi(X_t)|) < +\infty$. Show that the process $(\psi(X_t))_{t \geq 0}$ is a submartingale.*
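The main ingredient is the conditional form of Jensen's inequality: for a convex $\psi$ and $0 \leq s \leq t$,

\[ \mathbb{E}\big( \psi(X_t) \mid \mathfrak{F}_s \big) \geq \psi\big( \mathbb{E}(X_t \mid \mathfrak{F}_s) \big) = \psi(X_s), \]

where the last equality uses the martingale property of $(X_t)_{t \geq 0}$.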

**Solution of the exercise** (first hitting time of a closed set)**.** Let $(X_t)_{t \geq 0}$ be a continuous process adapted to $(\mathfrak{F}_t)_{t \geq 0}$, let $F \subset \mathbb{R}$ be closed, and let $T = \inf\{t \geq 0 : X_t \in F\}$ be the first hitting time of $F$. Then $T$ is an $(\mathfrak{F}_t)_{t \geq 0}$-stopping time. Indeed,

\[\begin{array}{rcl}
\{T \leq t\} & = & \{\omega \in \Omega : \;\; \exists \, s \leq t \;\; \mbox{such that} \;\; X_s(\omega) \in F\} \\
& = & \displaystyle\bigcap_{i=1}^{\infty} \bigcup_{\substack{s \leq t \\ s \in \mathbb{Q}}} \left\{\omega \in \Omega : \;\; d(X_s(\omega), F) < \frac{1}{i} \right\},
\end{array}\]

where the second equality holds because the paths are continuous (so the rationals $s \leq t$ suffice) and $F$ is closed (so any limit point of the path lying at distance $0$ from $F$ belongs to $F$). Each set in the last expression is an element of $\mathfrak{F}_s \subset \mathfrak{F}_t$, since the distance function $x \mapsto d(x, F)$ is continuous and hence $d(X_s, F)$ is $\mathfrak{F}_s$-measurable; the countable unions and intersection therefore yield an element of $\mathfrak{F}_t$.