\[ \newcommand{\bs}{\boldsymbol} \newcommand{\bsX}{\boldsymbol{X}} \newcommand{\bf}{\mathbf} \newcommand{\msc}{\mathscr} \newcommand{\mca}{\mathcal} \newcommand{\T}{\text{T}} \newcommand{\rme}{\mathrm{e}} \newcommand{\rmi}{\mathrm{i}} \newcommand{\rmj}{\mathrm{j}} \newcommand{\rmd}{\mathrm{d}} \newcommand{\rmm}{\mathrm{m}} \newcommand{\rmb}{\mathrm{b}} \newcommand{\and}{\land} \newcommand{\or}{\lor} \newcommand{\exist}{\exists} \newcommand{\sube}{\subseteq} \newcommand{\lr}[3]{\left#1 #2 \right#3} \newcommand{\set}[1]{\left\{#1\right\}} \newcommand{\intfy}{\int_{-\infty}^{+\infty}} \newcommand{\sumfy}[1]{\sum_{#1=-\infty}^{+\infty}} \newcommand{\vt}{\vartheta} \newcommand{\ve}{\varepsilon} \newcommand{\vp}{\varphi} \newcommand{\Var}{\text{Var}} \newcommand{\Cov}{\text{Cov}} \newcommand{\edef}{\xlongequal{def}} \newcommand{\prob}{\text{P}} \newcommand{\Exp}{\text{E}} \newcommand{\t}[1]{\text#1} \newcommand{\N}{\mathbb{N}} \newcommand{\Z}{\mathbb{Z}} \newcommand{\Q}{\mathbb{Q}} \newcommand{\R}{\mathbb{R}} \newcommand{\C}{\mathbb{C}} \newcommand{\versionofnewcommand}{\text{260125}} \]

1. Itô integrals

1.1 Construction of the Itô Integral

Basic Ideas

For the 1-dimensional case, the standard stochastic differential equation can be written in the form:

\[ \frac{\rmd X}{\rmd t}=b(t,X_t)+\sigma(t,X_t)\cdot W_t, \]

where \(W_t\) represents a noise term. The most common case is when \(W_t\) is a white noise, that is,

  1. \(t_1 \neq t_2\ \Rightarrow \ W_{t_1}\) and \(W_{t_2}\) are independent;
  2. \(\{W_t\}\) is stationary, i.e. the joint distribution of \(\{W_{t+t_1},\cdots,W_{t+t_k}\}\) is independent of \(t\);
  3. \(\Exp(W_t)=0\) for all \(t\).
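These three properties can be spot-checked for the discrete analogue of the noise term, where \(W_k\,\Delta t\) is realized as independent Gaussian increments. A small NumPy sketch (step size, sample counts and tolerances are arbitrary choices, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.01
# Discrete stand-in for the noise term: W_k * dt realized as Brownian
# increments dB_k ~ N(0, dt), independent across both paths and time.
dB = rng.normal(0.0, np.sqrt(dt), size=(10_000, 100))

mean = float(dB.mean())                               # property 3: should be ~ 0
corr = float(np.corrcoef(dB[:, 0], dB[:, 1])[0, 1])   # property 1: should be ~ 0
var_gap = float(abs(dB[:, :50].var() - dB[:, 50:].var()))  # property 2: stationary variance
```

The Monte Carlo estimates of the mean, the cross-time correlation, and the variance gap between early and late times are all statistically indistinguishable from zero.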

We may first consider the discrete form:

\[ X_{k+1} - X_k = b(t_k,X_k)\Delta t_k + \sigma(t_k,X_k)W_k\Delta t_k\ , \]

where we simplify the notation by \(X_j=X(t_j),\ W_k=W_{t_k},\ \Delta t_k=t_{k+1}-t_k\). In fact, \(W_k \Delta t_k\) are stationary independent increments with mean \(0\), and the only such process with continuous paths is Brownian motion; this suggests replacing \(W_k\Delta t_k\) by \(\Delta B_k = B_{t_{k+1}}-B_{t_k}\), where \(B_t\) is Brownian motion. Thus,

\[ X_k = X_0 +\sum_{j=0}^{k-1}b(t_j,X_j)\Delta t_j +\sum_{j=0}^{k-1}\sigma(t_j,X_j)\Delta B_j\ . \]
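The recursion above is precisely the Euler–Maruyama scheme. A minimal sketch, assuming NumPy; the function name `euler_maruyama`, the choice of drift \(b(t,x)=-x\), unit diffusion, and all parameters are illustrative, not from the text:

```python
import numpy as np

def euler_maruyama(b, sigma, x0, T, n, rng):
    """Simulate X_{k+1} = X_k + b(t_k, X_k)*dt + sigma(t_k, X_k)*dB_k."""
    dt = T / n
    t, x = 0.0, x0
    path = [x0]
    for _ in range(n):
        dB = rng.normal(0.0, np.sqrt(dt))   # Brownian increment ~ N(0, dt)
        x = x + b(t, x) * dt + sigma(t, x) * dB
        t += dt
        path.append(x)
    return np.array(path)

rng = np.random.default_rng(0)
# Example: dX = -X dt + dB (mean-reverting toward 0)
path = euler_maruyama(lambda t, x: -x, lambda t, x: 1.0, 1.0, 1.0, 1000, rng)
```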

The major task in bridging the gap between the discrete and continuous equations is to make sense of the stochastic term, which takes the form

\[ \int_S^T f(t,\omega)\ \rmd B_t(\omega)\ , \]

where \(0\leq S\leq T\), \(B_t(\omega)\) is 1-dimensional Brownian motion starting at the origin, for a wide class of functions \(f:[0,\infty)\times\Omega \to \R\). Intuitively, we want to analyze it in a manner similar to the Riemann-Stieltjes integral. We write

\[ f(t,\omega) = \sum_j f(t_j^*,\omega)\cdot\bf{I}_{[t_j,t_{j+1})}(t)\ , \]

where \(\bf{I}\) are indicator functions. However, unlike the Riemann-Stieltjes integral, the result would strongly depend on the choice of the points \(t_j^*\), because the paths of \(B_t\) have unbounded variation. In the Itô integral, we choose \(t_j^*\) to be the left end point \(t_j\). This approximation works out successfully provided that \(f\) has the property that each of the functions \(\omega \to f(t_j,\omega)\) depends only on the behaviour of \(B_s(\omega)\) up to time \(t_j\). This leads to the following important concepts:

**Definition 1.1.1 (Natural Filtration Generated by Brownian Motion):**

Let \(B_t(\omega)\) be \(n\)-dimensional Brownian motion. Then we define \(\msc{F}_t = \msc{F}_t^{(n)}\) to be the \(\sigma\)-algebra generated by the random variables \(B_s(\cdot),\ s\leq t\). That is, \(\msc{F}_t\) is the smallest \(\sigma\)-algebra containing all sets of the form

\[ \{\omega \ |\ B_{t_1}(\omega)\in F_1,\cdots,B_{t_k}(\omega)\in F_k \}\ , \]

where \(t_j \leq t\) and \(F_j \sube \R^n\) are Borel sets, \(j\leq k=1,2,\cdots\). We further assume that \(\msc{F}_t\) is completed by including all sets of outer measure zero. For an intuitive interpretation, imagine a particle located in region \(F_j\) at time \(t_j\); then \(\msc{F}_t\) contains all such basic states of the particle. We say that \(\msc{F}_t\) is the natural filtration generated by Brownian motion, where the term "filtration" refers to a family of \(\sigma\)-algebras expanding with the time index, that is, \(\msc{F}_s\sube \msc{F}_t\) for \(s<t\). By this definition, \(\msc{F}_t\) can be interpreted as the history of \(B_s\) up to time \(t\). We will give a more precise definition of filtration in the following sections.

Proposition 1.1.2:

A function \(h(\omega)\) is \(\msc{F}_t\)-measurable if and only if \(h\) can be written as the pointwise a.e. limit of sums of functions of the form

\[ g_1(B_{t_1})g_2(B_{t_2})\cdots g_k(B_{t_k}) \]

where \(g_1,\cdots,g_k\) are bounded continuous functions and \(t_j\leq t\) for \(j\leq k=1,2,\cdots\). Intuitively, that \(h\) is \(\msc{F}_t\)-measurable means that the value \(h(\omega)\) can be decided from the values of \(B_s(\omega)\) for \(s \leq t\) (the history).

Definition 1.1.3 (Filtration-Adapted Process):

Here we extend the concepts above to more general cases: from Brownian motion \(B_t(\omega)\) to an arbitrary stochastic process \(g(t,\omega)\), and from the natural filtration to an arbitrary filtration (information flow).

Let \(\{ \msc{N}_t \}_{t \geq 0}\) be a filtration, i.e. an increasing family of \(\sigma\)-algebras of subsets of \(\Omega\). A process

\[ g(t,\omega):[0,\infty)\times\Omega\to\R^n \]

is called \(\msc{N}_t\)-adapted if for each \(t\geq 0\) the function

\[ \omega \to g(t,\omega) \]

is \(\msc{N}_t\)-measurable.

Constructing the Itô Integral Step by Step

**Definition 1.1.4:**

Let \(\msc{V}=\msc{V}(S,T)\) be the class of functions

\[ f(t,\omega):[0,\infty)\times\Omega\to\R \]

such that

  1. \((t,\omega)\to f(t,\omega)\) is \(\msc{B}\times{\msc{F}}\)-measurable, where \(\msc{B}\) denotes the Borel \(\sigma\)-algebra on \([0,\infty)\).
  2. \(f(t,\omega)\) is \(\msc{F}_t\)-adapted.
  3. \(\displaystyle \Exp\lr[{\int_{S}^T f(t,\omega)^2\ \rmd t}]<\infty\).

For functions \(f\in\msc{V}\) we will show how to define the Itô integral. Firstly, we define

\[ \mathcal{I}[f](\omega)=\int_S^T f(t,\omega)\ \rmd B_t(\omega)\ , \]

where \(B_t\) is 1-dimensional Brownian motion. We begin the construction with simple functions, which is a natural idea. We say that a function \(\phi \in \msc{V}\) is elementary if it has the form

\[ \phi(t,\omega)=\sum_j e_j(\omega)\cdot I_{[t_{j},t_{j+1})}(t)\ , \]

where \(I\) is an indicator function and each \(e_j\) is a random variable. Note that since \(\phi\in\msc{V}\), each function \(e_j\) must be \(\msc{F}_{t_j}\)-measurable. Thus we define the integral

\[ \int_S^T \phi(t,\omega)\ \rmd B_t(\omega) = \sum_{j\geq 0} e_j(\omega)[B_{t_{j+1}}-B_{t_j}](\omega)\ . \]
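For a concrete elementary integrand, e.g. \(e_j = B_{t_j}\), this sum is easy to evaluate numerically, and replacing the left endpoint by the right endpoint shifts the result by exactly \(\sum_j (\Delta B_j)^2\); this illustrates the dependence on the evaluation point mentioned earlier. A NumPy sketch (all parameters are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
T, n = 1.0, 100_000
dt = T / n
# One Brownian path on a uniform grid, B_0 = 0
B = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n))])
dB = np.diff(B)

left = float(np.sum(B[:-1] * dB))   # Itô choice: e_j evaluated at t_j
right = float(np.sum(B[1:] * dB))   # right-endpoint choice: t_{j+1}
gap = right - left                  # equals sum(dB**2); close to T for fine meshes
```

The gap does not vanish as the mesh is refined, which is precisely why the choice of \(t_j^*\) matters here, unlike for the Riemann-Stieltjes integral.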

Now we make the following important observation:

Lemma 1.1.5 (The Itô Isometry):

If \(\phi(t,\omega)\) is bounded and elementary, then

\[ \Exp \lr[{\lr({\int_S^T\phi(t,\omega)\ \rmd B_t(\omega)})^2}] = \Exp \lr[{\int_S^T \phi(t,\omega)^2\ \rmd t}]\ . \]

​ **Proof:** Let \(\Delta B_j = B_{t_{j+1}}-B_{t_j}\). Then,

\[ \Exp \lr[{\lr({\int_S^T\phi\ \rmd B})^2}] = \Exp \lr[{\sum_{i,j} e_ie_j\Delta B_i\Delta B_j }]=\sum_{i,j}\Exp \lr[{e_ie_j\Delta B_i\Delta B_j }] \]

​ Since \(\Exp \lr[{e_ie_j\Delta B_i\Delta B_j }]=0\) for \(i\neq j\), we only need to consider the cases when \(i=j\). Moreover, \(e_j\) and the Brownian increment \(\Delta B_j\) are independent since \(\Delta B_j\) depends on the behavior of the process after time \(t_j\) while \(e_j\) depends on the history up to \(t_j\). Therefore

\[ \begin{aligned} \sum_{i,j}\Exp \lr[{e_ie_j\Delta B_i\Delta B_j }] = &\sum_{j} \Exp \lr [ { {e_j}^2 {(\Delta B_j) }^2} ] = \sum_{j} \Exp \lr [ { {e_j}^2 } ]\Exp \lr [ {{(\Delta B_j) }^2} ] \\ = &\sum_j \Exp \lr [ { {e_j}^2 } ] {(t_{j+1}-t_j)} = \int_S^T \Exp[{\phi}^2]\ \rmd t \\ = & \Exp\lr[{\int_S^T \phi^2 \ \rmd t}]\ , \end{aligned} \]

​ where \(\Exp\lr[{\lr({\Delta B_j})^2}]=t_{j+1}-t_j\) since the Brownian increment obeys the normal distribution \(\mathcal{N}(0,t_{j+1}-t_j)\).
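The isometry can be checked by Monte Carlo for a simple elementary integrand, for instance \(e_j = B_{t_j}\) on each \([t_j,t_{j+1})\). A NumPy sketch; discretization and path counts are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
T, n, paths = 1.0, 100, 20_000
dt = T / n
dB = rng.normal(0.0, np.sqrt(dt), size=(paths, n))
B_left = np.cumsum(dB, axis=1) - dB       # B_{t_j}: Brownian motion at left endpoints

# Elementary integrand e_j = B_{t_j}; stochastic integral = sum_j e_j * dB_j
I = np.sum(B_left * dB, axis=1)

lhs = float(np.mean(I**2))                          # E[(∫ φ dB)^2]
rhs = float(np.mean(np.sum(B_left**2 * dt, axis=1)))  # E[∫ φ^2 dt]
```

Both sides estimate the same quantity (here close to \(1/2\), since \(\Exp\int_0^1 B_t^2\,\rmd t = \int_0^1 t\,\rmd t\)).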

Now we try to extend the definition of the integral from elementary functions to functions in \(\msc{V}\).

Proposition 1.1.6:

Let \(g\in\msc{V}\) be bounded and \(g(\cdot,\omega)\) continuous for each \(\omega\). Then there exist elementary functions \(\phi_n\in\msc{V}\) such that

\[ \Exp\lr[{\int_S^T (g-\phi_n)^2\ \rmd t}] \to 0 \quad \text{as}\ n\to \infty \ . \]

Proposition 1.1.7:

Let \(h\in\msc{V}\) be bounded. Then there exist bounded functions \(g_n\in\msc{V}\) such that \(g_n(\cdot,\omega)\) is continuous for all \(\omega\) and \(n\), and

\[ \Exp\lr[{\int_S^T (h-g_n)^2\ \rmd t}] \to 0 \quad \text{as}\ n\to \infty \ . \]

Proof: Suppose \(|h(t,\omega)|\leq M\) for all \((t,\omega)\). For each \(n\), let \(\psi_n\) be a non-negative, continuous function on \(\R\) such that

​ (i). \(\psi_n(x)=0\) for \(x\leq -1/n\) and \(x\geq 0\) ;

​ (ii). \(\intfy \psi_n(x)\ \rmd x = 1\) .

​ Define

\[ g_n(t,\omega) = \int_0^t \psi_n(s-t)h(s,\omega)\ \rmd s\ . \]

​ Then \(g_n(\cdot,\omega)\) is continuous for each \(\omega\) and \(|g_n(t,\omega)|\leq M\). Since \(h\in\msc{V}\), we can show that \(g_n(t,\omega)\) is \(\msc{F}_t\)-measurable for all \(t\). Moreover,

\[ \int_S^T (g_n(s,\omega)-h(s,\omega))^2\ \rmd s\to 0\quad \text{as}\ n\to \infty\ \text{ for each }\omega\ , \]

​ since \(\set{\psi_n}_n\) constitutes an approximate identity. So by bounded convergence we have

\[ \Exp\lr[{\int_S^T (h(t,\omega)-g_n(t,\omega))^2\ \rmd t}] \to 0 \quad \text{as}\ n\to \infty \ . \]
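The mollification step can be visualized numerically. The sketch below picks one admissible \(\psi_n\) (a triangular bump of unit mass supported on \([-1/n,0]\)) and a deterministic \(h\); the helper name `mollify` and all parameters are illustrative choices, not from the text:

```python
import numpy as np

def mollify(h, t_grid, n):
    """g_n(t) = ∫_0^t ψ_n(s - t) h(s) ds, with ψ_n a triangular bump
    of unit mass supported on [-1/n, 0] (one admissible choice)."""
    out = np.empty_like(t_grid)
    for i, t in enumerate(t_grid):
        s = np.linspace(max(t - 1.0 / n, 0.0), t, 201)
        psi = 2.0 * n * n * (s - t + 1.0 / n)   # ψ_n(s - t) on its support
        vals = psi * h(s)
        ds = s[1] - s[0]
        out[i] = (vals.sum() - 0.5 * (vals[0] + vals[-1])) * ds  # trapezoid rule
    return out

t = np.linspace(0.0, 1.0, 50)
g = mollify(np.sin, t, n=200)
err = float(np.max(np.abs(g - np.sin(t))))   # sup-norm gap; shrinks as n grows
```

Since \(\psi_n\) only looks at the interval \((t-1/n,\,t]\), each \(g_n(t,\cdot)\) uses only the history up to time \(t\), which is why adaptedness is preserved.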

Proposition 1.1.8:

Let \(f\in\msc{V}\). Then there exists a sequence \(\set{h_n}\sube \msc{V}\) such that \(h_n\) is bounded for each \(n\) and

\[ \Exp\lr[{\int_S^T (f-h_n)^2\ \rmd t}] \to 0 \quad \text{as}\ n\to \infty \ . \]

Now, for \(f\in\msc{V}\), we can choose a sequence of elementary functions \(\set{\phi_n}\sube\msc{V}\) such that

\[ \Exp\lr[{\int_S^T |f-\phi_n|^2\ \rmd t}] \to 0 \quad \text{as}\ n\to \infty \ . \]

Thus we define

\[ \mathcal{I}[f](\omega)=\int_S^T f(t,\omega)\ \rmd B_t(\omega) = \lim_{n\to\infty} \int_S^T \phi_n(t,\omega)\ \rmd B_t(\omega)\ . \]

We summarize as follows:

Definition 1.1.9 (The Itô Integral):

Let \(f\in\msc{V}(S,T)\). Then the Itô integral of \(f\) is defined by

\[ \int_{S}^{T} f(t,\omega)\ \rmd B_t(\omega) = \lim_{n\to \infty} \int_S^T \phi_n(t,\omega)\ \rmd B_t(\omega)\ , \]

where \(\{\phi_n\}\) is a sequence of elementary functions such that

\[ \Exp\lr[{\int_S^T (f(t,\omega)-\phi_n(t,\omega))^2\ \rmd t}] \to 0 \quad \text{as}\ n\to\infty\ . \]

This definition is hard to put into practice, however. Some useful formulas and properties will be introduced in the following chapters.

Corollary 1.1.10 (The Itô Isometry):

\[ \Exp \lr[{\lr({\int_S^T f(t,\omega)\ \rmd B_t(\omega)})^2}] = \Exp \lr[{\int_S^T f(t,\omega)^2\ \rmd t}]\ ,\quad \forall f \in \msc{V}(S,T)\ . \]

Corollary:

If \(f(t,\omega)\in\msc{V}(S,T)\) and \(f_n(t,\omega)\in\msc{V}(S,T)\) for \(n=1,2,\cdots\) and

\[ \Exp\lr[{\int_S^T (f_n(t,\omega)-f(t,\omega))^2\ \rmd t}] \to 0 \quad \text{as}\ n\to\infty\ , \]

then

\[ \int_S^T f_n(t,\omega)\ \rmd B_t(\omega) \to \int_S^T f(t,\omega)\ \rmd B_t(\omega) \quad \text{in } L^2(P)\ \text{as } n \to \infty\ . \]

**Corollary:**

\[ \int_0^t B_s\ \rmd B_s = \frac{1}{2}({B_t}^2-{B_0}^2)-\frac{1}{2}t\ . \]

Proof: First we construct a sequence to approximate \(B_s\). Let

\[ \phi_n(t,\omega)=\sum_j B_j(\omega)\cdot I_{[t_{j},t_{j+1})}(t)\ . \]

​ We need to check whether this \(\phi_n\) is truly an approximation of \(B_s\):

\[ \begin{aligned} \Exp\lr[{\int_0^t (\phi_n-B_s)^2\ \rmd s}] &= \Exp \lr [{\int_0^t \lr({\sum_j B_j(\omega)\cdot I_{[t_{j},t_{j+1})}(s)-B_s})^2 \ \rmd s}]\\ &= \Exp \lr [{\sum_j \lr({ \int_{t_j}^{t_{j+1}} (B_j-B_s)^2\ \rmd s })}]\\ &= \sum_j\lr[{\int_{t_j}^{t_{j+1}}\Exp(B_j-B_s)^2\ \rmd s}]\\ &= \sum_j\lr[{\int_{t_j}^{t_{j+1}}(s-t_j)\ \rmd s }]\\ &= \sum_j\frac{1}{2}(t_{j+1}-t_j)^2 \to 0 \quad \text{as } \Delta t_j\to 0\ . \end{aligned} \]

​ Thus,

\[ \int_0^t B_s\ \rmd B_s = \lim_{\Delta t_j \to 0} \int_0^t \phi_n\ \rmd B_s = \lim_{\Delta t_j\to 0} \sum_j B_j\ \Delta B_j\ . \]

​ Intuitively, the cross term \(B_j\ \Delta B_j\) can be derived from the relationship between \(\Delta (B_j)^2\) and \((\Delta B_j)^2\).

\[ \begin{aligned} &\Delta(B_j)^2 = {B_{j+1}}^2 - {B_{j}}^2 = (B_{j+1}-B_j)^2 + 2 B_j (B_{j+1}-B_j) = (\Delta B_j)^2 + 2 B_j\ \Delta B_j\ .\\ \\ \Rightarrow\ &{B_t}^2 - {B_0}^2 = \sum_{j} \Delta(B_j)^2 = \sum_j (\Delta B_j)^2 + 2\sum_j B_j\ \Delta B_j \\ \Leftrightarrow\ & \sum_j B_j\ \Delta B_j = \frac{1}{2}({B_t}^2 - {B_0}^2)-\frac{1}{2}\sum_j (\Delta B_j)^2 \end{aligned} \]

​ What, then, is the behaviour of \(\sum_j(\Delta B_j)^2\), particularly in \(L^2(P)\), under the limit \(\Delta t_j\to 0\)? Intuitively it should be \(t\), since

\[ \Exp\lr({\sum_j\lr({\Delta B_j})^2}) = \sum_j \Exp\lr({\Delta B_j})^2 = \sum_j (t_{j+1}-t_j) = t\ . \]

​ Thus we need to prove that

\[ \lim_{\Delta t_j\to 0}\Exp\lr[{\lr({\sum_j\lr({\Delta B_j})^2-t})^2}]=0\ . \]

​ We achieve this by calculating the variance of \(\sum_j\lr({\Delta B_j})^2\), denoting it by \(Q_n\) for short.

\[ \Var(Q_n) = \Exp(Q_n-\Exp(Q_n))^2 = \Exp(Q_n - t)^2\ . \]

​ Since Brownian increments are independent,

\[ \Var(Q_n)=\Var\lr({\sum_j(\Delta B_j)^2}) = \sum_j \Var(\Delta B_j)^2\ . \]

​ Note that \(\Delta B_j\) obeys the normal distribution, and \(\Var(\Delta B_j)^2 = \Exp(\Delta B_j)^4 - (\Exp(\Delta B_j)^2)^2\). Recall that for a random variable \(X\) we can calculate its moment \(\Exp(X^k)\) by taking the \(k\)-th order derivative of its characteristic function at \(t=0\).

\[ \begin{aligned} &\Exp[\exp(\rmi t X)] = \Exp\lr({\sum_{m=0}^\infty \frac{\rmi^m X^m}{m!}t^m}) = \sum_{m=0}^\infty \frac{(\rmi t)^m}{m!}\Exp(X^m) \\ \Rightarrow\quad &\frac{\rmd^k}{\rmd t^k}\ \Exp[\exp(\rmi t X)]\bigg|_{t=0} = \rmi^k\,\Exp(X^k)\ . \end{aligned} \]

​ The characteristic function of \(\Delta B_j\) is \(\phi(s)=\exp(-s^2\,\Delta t_j/2)\) since \(\Delta B_j \sim \mathcal{N}(0,\Delta t_j)\). Thus \(\Exp(\Delta B_j)^2 = \Delta t_j\) and \(\Exp (\Delta B_j)^4 = 3(\Delta t_j)^2\), and then we have

\[ \Exp(Q_n - t)^2=\Var(Q_n) = \sum_j \Var(\Delta B_j)^2 = \sum_j 2(\Delta t_j)^2 \to 0\quad \text{as }\Delta t_j\to 0 \ . \]

​ Therefore we have proved that \(\sum_j (\Delta B_j)^2\to t\) in \(L^2(P)\) sense. Finally,

\[ \begin{aligned} \int_0^t B_s\ \rmd B_s &= \lim_{\Delta t_j\to 0} \sum_j B_j\ \Delta B_j \\ &= \frac{1}{2}({B_t}^2 - {B_0}^2)-\lim_{\Delta t_j\to 0} \frac{1}{2}\sum_j (\Delta B_j)^2 \\ &= \frac{1}{2}({B_t}^2 - {B_0}^2)-\frac{1}{2}t\ . \end{aligned} \]
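Both the quadratic-variation limit and the final formula are easy to verify numerically; note that the discrete identity \(\sum_j B_j\,\Delta B_j = \frac{1}{2}{B_t}^2-\frac{1}{2}Q_n\) holds exactly for each path. A NumPy sketch with arbitrary discretization:

```python
import numpy as np

rng = np.random.default_rng(0)
t, n, paths = 1.0, 1000, 5000
dt = t / n
dB = rng.normal(0.0, np.sqrt(dt), size=(paths, n))
B_right = np.cumsum(dB, axis=1)          # B at right endpoints t_{j+1}
B_left = B_right - dB                    # B at left endpoints t_j (B_0 = 0)

Q = np.sum(dB**2, axis=1)                # Q_n = sum_j (dB_j)^2, one per path
l2_err = float(np.mean((Q - t)**2))      # ~ 2 * sum_j (dt_j)^2 = 2 t^2 / n

ito = np.sum(B_left * dB, axis=1)        # left-endpoint sums  sum_j B_j dB_j
formula = 0.5 * B_right[:, -1]**2 - 0.5 * t
gap = float(np.max(np.abs(ito - formula)))   # shrinks as the mesh is refined
```

Here `l2_err` estimates \(\Exp(Q_n-t)^2\) and matches the bound \(2\sum_j(\Delta t_j)^2\) derived above, while `gap` is \(\max |Q_n - t|/2\) over the sampled paths.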

1.2 Some Properties of the Itô integral

Theorem 1.2.1:

Let \(f,g\in\msc{V}(0,T)\) and let \(0 \leq S < U < T\). Then

  1. \(\displaystyle \int_S^T f\ \rmd B_t = \int_S^U f \ \rmd B_t + \int_U^T f \ \rmd B_t\) for a.a. \(\omega\).
  2. \(\displaystyle \int_S^T (cf+g)\ \rmd B_t = c\cdot\int_S^T f\ \rmd B_t + \int_S^T g\ \rmd B_t\) for a.a. \(\omega\), where \(c\) is a constant.
  3. \(\displaystyle \Exp\lr[{\int_S^T f\ \rmd B_t}] = 0\).
  4. \(\displaystyle \int_S^T f\ \rmd B_t\) is \(\msc{F}_T\)-measurable.
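Properties 1 and 3 can be spot-checked by Monte Carlo with, e.g., \(f = B_t\); the helper `ito_sum` and the split point are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
T, n, paths = 1.0, 1000, 20_000
dt = T / n
dB = rng.normal(0.0, np.sqrt(dt), size=(paths, n))
B_left = np.cumsum(dB, axis=1) - dB               # integrand f = B at left endpoints

def ito_sum(f, dB, a, b):
    """Left-endpoint sum approximating the integral of f dB over grid indices [a, b)."""
    return np.sum(f[:, a:b] * dB[:, a:b], axis=1)

whole = ito_sum(B_left, dB, 0, n)                 # integral over [0, T]
split = ito_sum(B_left, dB, 0, n // 2) + ito_sum(B_left, dB, n // 2, n)

additive = bool(np.allclose(whole, split))        # property 1
mean_est = float(np.mean(whole))                  # property 3: should be ~ 0
```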

The Itô integral is a martingale

**Definition 1.2.2:**

A filtration on \((\Omega,\msc{F})\) is a family \(\msc{M} = \set{\msc{M}_t}_{t\geq 0}\) of \(\sigma\)-algebras \(\msc{M}_t\sube \msc{F}\) such that

\[ 0\leq s<t \quad \Rightarrow \quad \msc{M}_s\sube \msc{M}_t\ . \]

An \(n\)-dimensional stochastic process \(\set{{M}_t}_{t\geq 0}\) on \((\Omega,\msc{F},\prob)\) is called a martingale with respect to (w.r.t.) a filtration \(\set{\msc{M}_t}_{t\geq 0}\) if

  1. \(M_t\) is \(\msc{M}_t\)-measurable for all \(t\),
  2. \(\Exp(|M_t|)<\infty\) for all \(t\),
  3. \(\Exp(M_s\ |\ \msc{M}_t)= M_t\) for all \(s \geq t\).

That is to say, the expectation of a martingale at time \(s\), given the knowledge of its history up to time \(t\) (where \(s\geq t\)), is exactly the value of the martingale at time \(t\). In other words, the history provides no better prediction of the future value than the current one.

Proposition 1.2.3:

Brownian motion \(B_t\) in \(\R^n\) is a martingale w.r.t. the \(\sigma\)-algebras \(\msc{F}_t\) generated by \(\set{B_s\ |\ s \leq t}\).

Proof: We know that \(B_s-B_t\) is independent of \(\msc{F}_t\), and that \(\Exp(B_t\ |\ \msc{F}_t)=B_t\) since \(B_t\) is \(\msc{F}_t\)-measurable. Hence for \(s\geq t\),

\[ \begin{aligned} \Exp(B_s\ |\ \msc{F}_t) &= \Exp (B_s - B_t +B_t\ |\ \msc{F}_t) \\ &= \Exp(B_s-B_t\ |\ \msc{F}_t) +\Exp(B_t\ |\ \msc{F}_t) \\ &= \Exp(B_s-B_t)+B_t = 0+ B_t = B_t\ . \end{aligned} \]
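The martingale property \(\Exp(B_s\,|\,\msc{F}_t)=B_t\) can be illustrated by fixing one history up to time \(t\) and averaging independent continuations; a small NumPy sketch with arbitrary parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
t, s = 0.5, 1.0

# Fix one realization of the history up to time t (so B_t is known) ...
B_t = float(rng.normal(0.0, np.sqrt(t)))
# ... then average many independent continuations B_s = B_t + (B_s - B_t),
# using that the increment B_s - B_t ~ N(0, s - t) is independent of F_t.
cond_mean = float(np.mean(B_t + rng.normal(0.0, np.sqrt(s - t), size=200_000)))
```

The Monte Carlo estimate of \(\Exp(B_s\,|\,\msc{F}_t)\) coincides with the fixed value \(B_t\), not with the unconditional mean \(0\).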

Theorem 1.2.4 (Doob's Martingale Inequality):

If \(M_t\) is a martingale such that \(t\to M_t(\omega)\) is continuous a.s., then for all \(p \geq 1\), \(T \geq 0\) and all \(\lambda >0\)

\[ \prob \lr({\sup_{0 \leq t \leq T} |M_t| \geq \lambda})\leq \frac{1}{\lambda^p}\cdot\Exp(|M_T|^p)\ . \]
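For \(M_t = B_t\) and \(p=2\), the inequality can be checked by simulation (parameters are arbitrary, and the discrete maximum only approximates the supremum):

```python
import numpy as np

rng = np.random.default_rng(0)
T, n, paths, lam, p = 1.0, 1000, 20_000, 1.5, 2
dt = T / n
B = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(paths, n)), axis=1)

lhs = float(np.mean(np.max(np.abs(B), axis=1) >= lam))  # P(sup_{t<=T} |B_t| >= lam)
rhs = float(np.mean(np.abs(B[:, -1])**p)) / lam**p      # E(|B_T|^p) / lam^p
```

With these values the bound is far from tight (roughly \(0.27\) versus \(0.44\)), but it holds uniformly in \(\lambda\), which is what matters for the continuity theorem below.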

**Theorem 1.2.5:**

Let \(f\in\msc{V}(0,T)\). Then there exists a \(t\)-continuous version of

\[ \int_0^t f(s,\omega)\ \rmd B_s(\omega)\ ,\quad 0 \leq t \leq T\ , \]

i.e. there exists a \(t\)-continuous stochastic process \(J_t\) on \((\Omega,\msc{F},\prob)\) such that

\[ \prob\lr({J_t = \int_0^t f\ \rmd B}) = 1 \quad \forall\ t\ , 0\leq t \leq T\ . \]

From now on we shall always assume that \(\int_0^t f(s,\omega)\ \rmd B_s(\omega)\) means a \(t\)-continuous version of the integral.

**Corollary 1.2.6:**

Let \(f(t,\omega)\in\msc{V}(0,T)\) for all \(T\). Then

\[ M_t(\omega) = \int_0^t f(s,\omega)\ \rmd B_s \]

is a martingale w.r.t. \(\msc{F}_t\) and

\[ \prob\lr({\sup_{0 \leq t \leq T} |M_t|\geq\lambda}) \leq \frac{1}{\lambda^2} \cdot \Exp \lr({\int_0^T f(s,\omega)^2\rmd s})\ . \]

1.3 Extensions of the Itô Integral

The Itô integral can be defined for a larger class of integrands \(f\) than \(\msc{V}\). First, the condition that \(f\) is \(\msc{F}_t\)-measurable can be relaxed to the following: There exists an increasing family of \(\sigma\)-algebras \(\msc{H}_t\) (\(t>0\)) such that \(B_t\) is a martingale with respect to \(\msc{H}_t\) , and \(f_t\) is \(\msc{H}_t\)-adapted. Note that \(\msc{F}_t\sube \msc{H}_t\) since \(B_t\) is a martingale w.r.t. \(\msc{F}_t\), where \(\msc{F}_t\) is the \(\sigma\)-algebras generated by \(\set{B_s\ |\ s\leq t}\), thus it is "the smallest \(\msc{H}_t\)".

The most important example is the following \(n\)-dimensional case: Suppose \(B_k(t,\omega)\) is the \(k\)-th coordinate of \(n\)-dimensional Brownian motion \((B_1,\cdots,B_n)^\T\). Let \(\msc{F}_t^{(n)}\) be the \(\sigma\)-algebra generated by multiple Brownian motions \(B_1(s_1,\cdot),\cdots,B_n(s_n,\cdot)\) \((s_k\leq t)\). Then \(B_k(t,\omega)\) is a martingale w.r.t. \(\msc{F}_t^{(n)}\). Therefore we can define the multi-dimensional Itô integral.

Definition 1.3.1:

Let \(B = (B_1,\cdots, B_n)^\T\) be \(n\)-dimensional Brownian motion. Then \(\msc{V}_\msc{H}^{m \times n}(S,T)\) denotes the set of \(m\times n\) matrices \(v = [v_{ij}(t,\omega)]\) where each entry \(v_{ij}(t,\omega)\) satisfies the (extended) conditions required for the Itô integral with respect to some filtration \(\msc{H} = \set{\msc{H}_t}_{t\geq 0}\). Therefore we define the multi-dimensional Itô integral as follows:

\[ \int_S^T v\ \rmd B = \int_S^T \begin{pmatrix} v_{11} & \cdots & v_{1n} \\ \vdots & & \vdots \\ v_{m1} & \cdots & v_{mn} \end{pmatrix} \begin{pmatrix} \rmd B_1\\ \vdots \\ \rmd B_n \end{pmatrix}\ . \]

If \(\msc{H} = \set{\msc{F}_t^{(n)}}_{t\geq 0}:=\msc{F}^{(n)}\), we simply write \(\msc{V}^{m\times n}(S,T)\), since \(\msc{F}^{(n)}\) is one of the most commonly used filtrations. We also put

\[ \msc{V}^{m\times n} = \msc{V}^{m\times n}(0,\infty) = \bigcap_{T>0}\msc{V}^{m \times n} (0,T)\ . \]
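In coordinates, the integral is an \(m\)-vector whose \(i\)-th component is \(\sum_j \int_S^T v_{ij}\,\rmd B_j\). A NumPy sketch with a deterministic (hence adapted) integrand; the matrix \(v(t)\) and all parameters are illustrative choices, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n_dim, steps, T = 2, 3, 1000, 1.0
dt = T / steps
dB = rng.normal(0.0, np.sqrt(dt), size=(steps, n_dim))   # increments of n-dim B

def v(t):
    # A deterministic (hence trivially adapted) 2x3 matrix integrand.
    return np.array([[1.0, t, 0.0],
                     [0.0, 1.0, np.sin(t)]])

# Discrete approximation: sum_k v(t_k) @ dB_k, a matrix-vector product per
# step, giving an m-dimensional random vector.
I = sum(v(k * dt) @ dB[k] for k in range(steps))
```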

**Definition 1.3.2:**

Actually the third condition for the Itô integral can also be relaxed to a weaker one:

\[ \prob\lr[{\int_{S}^T f(t,\omega)^2\ \rmd t<\infty }] = 1\ . \]

For this case, we let \(\msc{W}_\msc{H}(S,T)\) denote the class of processes \(f(t,\omega)\) satisfying the weaker conditions, and \(\msc{W}_\msc{H}^{m\times n}(S,T)\) for the matrix case.