
Question

Let $M$ be a continuous local martingale, null at zero. For each $n$, let $\tau_n=\inf\{t:|M_t|>n\}$, which is a stopping time. Does $M^{\tau_n}\to M$ u.c.p. (uniformly on compacts in probability)?

Here, a sequence of random processes $(M^n)_{n}$ is said to converge u.c.p. to $M$ if

$$P(\sup_{s\in[0,t]} |M^n_s - M_s|>\epsilon)\to 0$$

for any $t>0$ and $\epsilon>0$.
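
For intuition only, here is a small Monte Carlo sketch of this definition. It uses a simulated Brownian motion as a stand-in for $M$ (an assumption made purely for illustration; the function name and parameters below are mine), stops it at the discretised $\tau_n$, and estimates the probability above for a few levels $n$.

```python
import numpy as np

rng = np.random.default_rng(0)

def ucp_error_prob(n, t=1.0, eps=0.1, steps=1000, n_paths=2000):
    """Monte Carlo estimate of P( sup_{s<=t} |M^{tau_n}_s - M_s| > eps ),
    with M a simulated Brownian motion standing in for the local martingale."""
    dt = t / steps
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, steps))
    M = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)], axis=1)
    # first grid index at which |M| exceeds the level n (tau_n on the grid);
    # if the level is never reached on [0, t], use the last index (no stopping)
    exceeded = np.abs(M) > n
    tau_idx = np.where(exceeded.any(axis=1), exceeded.argmax(axis=1), steps)
    # stopped path M^{tau_n}: freeze each path at its own exit index
    col = np.arange(steps + 1)
    M_stopped = np.take_along_axis(M, np.minimum(col, tau_idx[:, None]), axis=1)
    sup_err = np.abs(M_stopped - M).max(axis=1)
    return (sup_err > eps).mean()

for n in (1, 2, 3):
    print(n, ucp_error_prob(n))   # estimates decrease towards 0 as n grows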

Background

When defining the stochastic integral $H\cdot M$ with respect to a local martingale $M$, the standard procedure is to first define the integral with respect to martingales and then use $H \cdot M^{T(n)}\to H\cdot M$ to define $H\cdot M$ (see, for example, (30.7, Vol. 2) of Rogers and Williams). This definition only makes sense if we have the convergence $M^{T(n)}\to M$.

2 Answers


You have $\sup_{s\in[0,t]}|M^{\tau_n}_s-M_s|=0$ on the event $\{\tau_n\ge t\}$, and so $$ P\left(\sup_{s\in[0,t]}|M^{\tau_n}_s-M_s|>\epsilon\right)\le P(\tau_n<t)\to 0,\quad n\to\infty, $$ because $\tau_n$ increases to $+\infty$ a.s.
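
For a rough numerical illustration of the dominating term $P(\tau_n<t)$ in this bound, here is a hedged sketch that again takes $M$ to be a simulated Brownian motion (an illustrative assumption, not part of the question) and estimates $P(\tau_n<t)=P(\sup_{s\le t}|M_s|>n)$ by Monte Carlo (up to discretisation of the path).

```python
import numpy as np

rng = np.random.default_rng(1)

def prob_tau_before_t(n, t=1.0, steps=1000, n_paths=5000):
    """Monte Carlo estimate of P(tau_n < t) = P( sup_{s<=t} |M_s| > n ),
    with M a simulated Brownian motion standing in for the local martingale."""
    dt = t / steps
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, steps))
    running_max = np.abs(np.cumsum(dW, axis=1)).max(axis=1)
    return (running_max > n).mean()

for n in (1, 2, 3, 4):
    print(n, prob_tau_before_t(n))   # the bound in the answer; it decays to 0
```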

John Dawkins

I add some details to Dawkins' answer.

(1) Show that $\tau_n\to \infty$ almost surely.

Throughout, fix some $\omega\in \Omega$. First suppose the path $t\mapsto M(t,\omega)$ is bounded, that is, there exists $K>0$ such that $|M(t,\omega)|<K$ for all $t>0$. In this case, for every $n>K$, $$\tau_n(\omega)=\inf\{t>0:|M(t,\omega)|>n\}=\inf \emptyset=\infty.$$

When the path $t\mapsto M(t,\omega)$ is unbounded, define, for each $k\in\mathbb{N}$,

$$\tau_n^k(\omega)=\inf \{t\in(0,k]:|M(t,\omega)|>n\}.$$

Since $M$ is continuous and null at zero, $t\mapsto M(t,\omega)$ is bounded on $(0,k]$, so the previous case gives $\tau^k_n(\omega)=\infty$ for all sufficiently large $n$ (namely once $n$ exceeds $\sup_{t\in(0,k]}|M(t,\omega)|$).

Now fix $k$ and take $n$ this large. Then $\tau^k_n(\omega)=\infty$ means $|M(t,\omega)|\leq n$ for every $t\in(0,k]$, so any $t$ with $|M(t,\omega)|>n$ satisfies $t>k$, and hence $\tau_n(\omega)\geq k$. Therefore

$$\liminf_{n\to \infty} \tau_n(\omega)\geq k \quad\text{for every } k, \qquad\text{i.e.}\qquad \lim_{n\to \infty} \tau_n(\omega)=\infty.$$
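
To make this pathwise statement concrete, here is a small sketch (the function name and the Brownian sample path are my own illustrative choices) that computes $\tau_n$ on a discretised finite horizon $[0,T]$: once $n$ exceeds the maximum of $|M(\cdot,\omega)|$ on $[0,T]$, the level is never reached on that horizon, i.e. $\tau_n(\omega)\geq T$, which is the finite-horizon shadow of $\tau_n(\omega)\to\infty$.

```python
import numpy as np

rng = np.random.default_rng(2)

def first_exit_time(path, times, n):
    """tau_n for one discretised path: first grid time with |path| > n,
    or np.inf if the level n is never reached on the grid."""
    hits = np.flatnonzero(np.abs(path) > n)
    return times[hits[0]] if hits.size else np.inf

# One fixed path omega: a simulated Brownian motion on [0, T]
T, steps = 10.0, 10_000
times = np.linspace(0.0, T, steps + 1)
path = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(T / steps), steps))])

for n in range(1, 7):
    # nondecreasing in n; once n exceeds max|path| on [0, T] it is inf,
    # i.e. tau_n >= T on this horizon
    print(n, first_exit_time(path, times, n))
```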

(2) $\tau_n \to \infty$ a.s. implies $\tau_n\to \infty$ in probability.

Recall that a sequence of random variables $X_n$ is said to diverge to infinity in probability if

$$\forall K>0, \lim_{n\to \infty} P(X_n>K)=1$$

A sequence of random variables $X_n$ is said to diverge to infinity almost surely if

$$P(\omega\in\Omega:\lim_{n\to \infty} X_n(\omega)=\infty)=1$$

In other words,

$$\forall K>0,\quad P\big(\omega\in\Omega: X_n(\omega)>K \text{ for all sufficiently large } n\big)=1.$$

Now we use Fatou's lemma to prove the claim. Specifically, letting $A=(K,\infty]$, we have

$$\liminf_{n\to\infty} P(X_n>K)=\liminf_{n\to\infty} E[1_A(X_n)]\geq E\Big[\liminf_{n\to\infty} 1_A(X_n)\Big]=E[1_A(\infty)]=1,$$

where the last equality holds because, almost surely, $X_n>K$ for all sufficiently large $n$, so $\liminf_{n\to\infty} 1_A(X_n)=1_A(\infty)=1$ a.s.

Therefore $\lim_{n\to\infty} P(X_n>K)=1$, so $X_n\to \infty$ a.s. implies $X_n \to \infty$ in probability.
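
As a quick sanity check of this implication (with a toy sequence of my own choosing, not anything from the question), take $X_n=n|Z|$ for a standard normal $Z$: then $X_n\to\infty$ a.s., and the empirical $P(X_n>K)$ indeed tends to $1$.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy sequence diverging a.s.: X_n = n * |Z| with Z standard normal,
# so |Z| > 0 a.s. and X_n(omega) -> infinity for almost every omega.
Z = np.abs(rng.standard_normal(200_000))
K = 50.0
for n in (10, 100, 1_000, 10_000):
    # empirical P(X_n > K); it increases to 1, as the Fatou argument predicts
    print(n, np.mean(n * Z > K))
```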